Introduction to Solar
Table of Contents
Why Solar?
Solar Mini vs Solar Pro
LLM Components
Demo: Try Solar at the Playground
What is Prompt Engineering?
Upstage's Solar is a high-performance LLM solution designed for anyone to use easily.
Upstage offers two models to meet diverse user needs: Solar Mini and Solar Pro.
Each model is optimized for different environments and is designed for real-world business applications.
Solar Mini is a lightweight LLM that delivers fast and efficient performance.
✅ Fast processing speed, ideal for real-time AI services
✅ On-device execution, allowing AI to run locally on user devices
✅ Easier optimization, enabling tailored applications for specific domains or services
✅ Reduced GPU resource consumption, leading to cost savings and efficient operations
"The most intelligent LLM on a single GPU."
Solar Pro delivers performance comparable to a 70B+ model on a single GPU.
Designed to meet business-specific needs, it excels in structured document processing, multilingual support, and domain-specific expertise.
✅ Delivers 70B+ model-level performance on a single GPU
✅ Excels in English, Korean, and Japanese, making it a truly multilingual model
✅ Highly effective in specialized fields, such as finance, healthcare, and legal industries
✅ Generates structured AI responses, such as JSON outputs for automation
Summary
✅ Solar Mini is ideal for lightweight AI applications.
✅ Solar Pro is suitable for high-performance AI needs in domains like finance, legal, and healthcare.
Before using an LLM, it is crucial to understand its core components and key configuration considerations.
1️⃣ User Input
Definition: The question or request a user sends to the LLM.
Example:
"What is the weather like in Seoul today?"
2️⃣ Prompt
Definition: Instructions or examples provided along with the input to guide the AI model's response.
Example:
Instruction: "You are a weather expert. Provide the weather forecast for a requested location in a friendly tone, following this format:"
Format: "Today's weather in [location]
is [condition]
with a temperature of [xx]Β°C
."
Example: "Today's weather in New York is rainy with a temperature of 12Β°C."
3️⃣ Output
Definition: The response generated by the LLM based on the input and prompt.
Example:
"Today's weather in Seoul is sunny with a temperature of 25Β°C."
To use LLMs effectively, it is essential to control output length and response style using various configuration settings.
Max Tokens
Definition: The maximum number of tokens the model can generate in a response.
What is a Token?
LLMs process text in tokens rather than words.
A token can be a word, article, or even punctuation.
Example: "Hello, nice to meet you."
Tokens: ["Hello", ",", "nice", "to", "meet", "you", "."]
→ 7 tokens
🚨 Important Notes:
The Max Tokens setting controls the length of the response.
The maximum limit is 16,000 tokens (16K).
If set too low, the response may be cut off; if too high, the output may be unnecessarily long.
💡 Tip:
Adjust Max Tokens based on the desired response length and level of detail.
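As a rough illustration, the sketch below caps a completion at 100 tokens and checks whether the answer was cut off. It reuses the assumed OpenAI-compatible client from the earlier example; a finish_reason of "length" is the conventional truncation signal in such APIs, but verify the exact behavior in the Upstage docs.

```python
import os
from openai import OpenAI

client = OpenAI(
    api_key=os.environ["UPSTAGE_API_KEY"],      # hypothetical env var name
    base_url="https://api.upstage.ai/v1/solar", # assumed endpoint
)

response = client.chat.completions.create(
    model="solar-pro",  # assumed model identifier
    messages=[{"role": "user", "content": "Summarize the benefits of solar energy."}],
    max_tokens=100,     # cap the response at roughly 100 tokens
)

choice = response.choices[0]
print(choice.message.content)

# OpenAI-compatible APIs typically report why generation stopped;
# "length" means the cap was hit and the answer was likely cut off.
if choice.finish_reason == "length":
    print("Response was truncated - consider raising max_tokens.")
```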
Temperature
Definition: A value between 0 and 1 that controls the diversity and creativity of responses.
Characteristics:
Lower values (closer to 0): More precise and predictable responses.
Ideal for technical documents, data analysis, and factual explanations.
Higher values (closer to 1): More creative and varied responses.
Suitable for marketing copy, storytelling, and brainstorming.
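A quick way to feel this difference is to send the same prompt at several temperature values and compare the outputs. The sketch below assumes the same OpenAI-compatible client and model identifier as the earlier examples.

```python
import os
from openai import OpenAI

client = OpenAI(
    api_key=os.environ["UPSTAGE_API_KEY"],      # hypothetical env var name
    base_url="https://api.upstage.ai/v1/solar", # assumed endpoint
)

question = "Suggest a slogan for an eco-friendly water bottle."

# Low temperature -> predictable, factual; high temperature -> varied, creative.
for temperature in (0.0, 0.5, 1.0):
    response = client.chat.completions.create(
        model="solar-pro",  # assumed model identifier
        messages=[{"role": "user", "content": question}],
        temperature=temperature,
    )
    print(f"temperature={temperature}: {response.choices[0].message.content}")
```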
Now, let's try using the Solar LLM in practice!
The Playground is a real-time LLM chatbot environment provided by Upstage.
Allows users to experiment with text generation, document summarization, translation, and more.
Designed for both developers and non-developers.
Access the Playground and test Solar models
Enter simple prompts and analyze responses
Adjust settings like model selection and temperature
Access the Playground
Select a Model
Choose Solar Mini or Solar Pro.
Enter a Prompt
Adjust Parameters
Max Tokens: Set the response length.
Temperature: Adjust response creativity.
Review and Compare Responses
Test different models and settings to observe variations.
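If you prefer to compare models from code rather than the Playground UI, a sketch like the following sends one prompt to both models with fixed parameters. The model identifiers are assumptions; use the exact names shown in the Playground or the API documentation.

```python
import os
from openai import OpenAI

client = OpenAI(
    api_key=os.environ["UPSTAGE_API_KEY"],      # hypothetical env var name
    base_url="https://api.upstage.ai/v1/solar", # assumed endpoint
)

prompt = "Summarize the key benefits of on-device AI in three bullet points."

# Assumed model identifiers; check the Playground or API docs for the exact names.
for model in ("solar-mini", "solar-pro"):
    response = client.chat.completions.create(
        model=model,
        messages=[{"role": "user", "content": prompt}],
        max_tokens=200,   # response length
        temperature=0.3,  # response creativity
    )
    print(f"--- {model} ---")
    print(response.choices[0].message.content)
```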
Prompt Engineering is the art of designing optimal prompts to help LLMs generate accurate and useful responses.
Crafting a well-structured prompt can enhance AI-generated outputs, improving productivity and efficiency.
Be Clear and Specific
β "Where should I travel?"
β "Recommend three scenic international destinations for August."
Provide Context
β "Write an ad for my company."
β "Create a 50-character LinkedIn/Twitter ad for 'Solar LLM,' highlighting its performance and cost-effectiveness."
Assign a Role
β "You are a financial analyst. Provide a long-term investment strategy for 2024."
Specify Output Format
β "Summarize my budget plan in a table format by category (food, transport, leisure)."
Provide Examples
β "Write an ad similar to βSign up now and get 20% off!β"
❌ Bad Prompt Example
"Create a marketing strategy for a new product launch.β
The scope is too broad, and key details like target market, budget, and channels are unclear.
✅ Good Prompt Example
[Role Assignment] You are a global marketing consultant.
[Context]
A startup has developed an AI-powered customer support solution and is preparing to launch a new product. Their primary customers are small businesses, and their goal is to attract as many companies as possible within a limited budget.
Based on the following details, create a 6-month marketing strategy:
Target Market: North America, Europe
Marketing Budget: $500,000
Key Channels: LinkedIn ads, Google search ads, email marketing
Goal: Acquire 500 business clients within the first 3 months
[Output Format] Please structure your response as follows:
Market Analysis: Compare AI customer support trends and key competitors in a table.
Marketing Strategy: Split into the first 3 months and the next 3 months with specific action plans.
Budget Allocation: Breakdown of marketing spend by channel (%)
Performance Metrics (KPIs): Click-through rate, new customer acquisition, website visits
[Example Reference] Include a sample excerpt from a previous marketing strategy document.
Ensure the response follows this format.
By structuring prompts effectively, AI can deliver more precise and useful responses.
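To reproduce the good prompt example programmatically, the role assignment fits naturally into the system message and the bracketed sections into the user message. The sketch below is one way to do this with the assumed OpenAI-compatible client; adjust max_tokens so the structured answer has enough room.

```python
import os
from openai import OpenAI

client = OpenAI(
    api_key=os.environ["UPSTAGE_API_KEY"],      # hypothetical env var name
    base_url="https://api.upstage.ai/v1/solar", # assumed endpoint
)

system_prompt = "You are a global marketing consultant."  # [Role Assignment]

user_prompt = """\
[Context]
A startup has developed an AI-powered customer support solution and is preparing
to launch a new product. Their primary customers are small businesses, and their
goal is to attract as many companies as possible within a limited budget.

Based on the following details, create a 6-month marketing strategy:
- Target Market: North America, Europe
- Marketing Budget: $500,000
- Key Channels: LinkedIn ads, Google search ads, email marketing
- Goal: Acquire 500 business clients within the first 3 months

[Output Format] Please structure your response as follows:
1. Market Analysis: Compare AI customer support trends and key competitors in a table.
2. Marketing Strategy: Split into the first 3 months and the next 3 months with specific action plans.
3. Budget Allocation: Breakdown of marketing spend by channel (%)
4. Performance Metrics (KPIs): Click-through rate, new customer acquisition, website visits

Ensure the response follows this format.
"""

response = client.chat.completions.create(
    model="solar-pro",  # assumed model identifier
    messages=[
        {"role": "system", "content": system_prompt},
        {"role": "user", "content": user_prompt},
    ],
    max_tokens=1500,    # the structured answer needs room
    temperature=0.4,
)
print(response.choices[0].message.content)
```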
Prompt engineering is not about getting a perfect result in a single attempt, but rather an iterative process of refinement and optimization.
Adjusting Style: Modify the tone or writing style of responses.
Tuning Response Length: Control the length of responses to avoid them being too short or too long.
Emphasizing Key Information: Highlight specific aspects such as price, features, or important details.
Providing Examples: Include sample outputs similar to the desired response (see the few-shot sketch after this list).
Reiterating Key Information: Reinforce crucial points at the end of the prompt.
Considering English Inputs: LLMs may generate more precise responses when prompts are in English.
Using Markdown: Structure outputs using lists or tables for better readability.
💡 Experiment with these techniques in different ways to discover the most effective prompt for your needs!
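One concrete way to apply the "Providing Examples" tip is few-shot prompting: include a sample exchange in the message history so the model imitates its style. The sketch below assumes the same client setup as before; the sample ad and constraints are illustrative.

```python
import os
from openai import OpenAI

client = OpenAI(
    api_key=os.environ["UPSTAGE_API_KEY"],      # hypothetical env var name
    base_url="https://api.upstage.ai/v1/solar", # assumed endpoint
)

# Few-shot prompting: show a sample output in the desired style,
# then reiterate the key constraints at the end of the prompt.
messages = [
    {"role": "system", "content": "You write short, punchy promotional copy."},
    {"role": "user", "content": "Write an ad for a coffee subscription."},
    {"role": "assistant", "content": "Sign up now and get 20% off your first month!"},
    {
        "role": "user",
        "content": (
            "Write an ad for the Solar LLM in the same style. "
            "Keep it under 15 words and mention cost savings."
        ),
    },
]

response = client.chat.completions.create(
    model="solar-pro",  # assumed model identifier
    messages=messages,
    temperature=0.8,    # a bit more creativity for marketing copy
)
print(response.choices[0].message.content)
```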
The Solar Prompt Cookbook is a comprehensive guide designed to help you craft effective prompts for Upstage's Solar model.
It explains the differences between Prompt Engineering and Fine-Tuning, offering a hands-on approach that covers everything from basic to advanced techniques. By comparing poor and well-crafted prompt examples, the cookbook guides you in optimizing your prompt design for the best results.
🔹 Solar Mini: Lightweight and fast for efficient AI solutions.
🔹 Solar Pro: High-performance AI tailored for specialized domains.
🔹 LLM Configuration: Understanding input, prompt, output, max tokens, and temperature.
🔹 Playground Practice: Hands-on experience using Solar models.
🔹 Prompt Engineering: Effective techniques for crafting precise prompts.
YoungHoon Jeon | AI Edu | Upstage
💡 Solar Mini is lightweight yet powerful, offering flexible applications across various environments.
💡 Solar Pro provides a trusted AI solution for enterprises and is suitable for handling complex data across multiple industries.
Ready to start designing prompts with Solar?