Introduction to Solar


πŸ“Œ Table of Contents

  • Why Solar?

  • Solar Mini vs Solar Pro

  • LLM Components

  • Demo: Try Solar at the Playground

  • What is Prompt Engineering?

Upstage’s Solar is a high-performance LLM solution designed to be easy for anyone to use.

1. Why Solar?

Upstage offers two models to meet diverse user needs: Solar Mini and Solar Pro.

Each model is optimized for different environments and is designed for real-world business applications.

πŸ€– Solar Mini: Compact and Powerful

  • Solar Mini is a lightweight LLM that delivers fast and efficient performance.

Why Solar Mini

βœ” Fast processing speed, ideal for real-time AI services

βœ” On-device execution, allowing AI to run locally on user devices

βœ” Easier optimization, enabling tailored applications for specific domains or services

βœ” Reduced GPU resource consumption, leading to cost savings and efficient operations

πŸ€– Solar Pro: The Most Intelligent LLM on a Single GPU

"The most intelligent LLM on a single GPU."

Solar Pro delivers performance comparable to a 70B+ model on a single GPU.

Designed to meet business-specific needs, it excels in structured document processing, multilingual support, and domain-specific expertise.

Why Solar Pro

βœ” Delivers 70B+ model-level performance on a single GPU

βœ” Excels in English, Korean, and Japanese, making it a truly multilingual model

βœ” Highly effective in specialized fields, such as finance, healthcare, and legal industries

βœ” Generates structured AI responses, such as JSON outputs for automation

Summary

βœ” Solar Mini is ideal for lightweight AI applications.

βœ” Solar Pro is suitable for high-performance AI needs in domains like finance, legal, and healthcare.

2. LLM Components

Before using an LLM, it is crucial to understand its core components and key configuration considerations.

πŸ’‘ LLM Components

1️⃣ User Input

  • Definition: The question or request a user sends to the LLM.

  • Example:

    • "What is the weather like in Seoul today?"

2️⃣ Prompt

  • Definition: Instructions or examples provided along with the input to guide the AI model’s response.

  • Example:

    • Instruction: "You are a weather expert. Provide the weather forecast for a requested location in a friendly tone, following this format:"

    • Format: "Today's weather in [location] is [condition] with a temperature of [xx]Β°C."

    • Example: "Today's weather in New York is rainy with a temperature of 12Β°C."

3️⃣ Output

  • Definition: The response generated by the LLM based on the input and prompt.

  • Example:

    • "Today's weather in Seoul is sunny with a temperature of 25Β°C."

πŸ” Key Considerations When Using LLMs

To use LLMs effectively, it is essential to control output length and response style using various configuration settings.

Max Tokens (Maximum Output Length)

  • Definition: The maximum number of tokens the model can generate in a response.

  • What is a Token?

    • LLMs process text in tokens rather than words.

    • A token can be a word, an article, or even a punctuation mark (see the token-counting sketch after this list).

    • Example: "Hello, nice to meet you."

      • Tokens: ["Hello", ",", "nice", "to", "meet", "you", "."] β†’ 7 tokens

  • 🚨 Important Notes:

    • The Max Tokens setting controls the length of the response.

    • The maximum limit is 16,000 tokens (16K).

    • If set too low, the response may be cut off; if too high, the output may be unnecessarily long.

  • πŸ’‘ Tip:

    • Adjust Max Tokens based on the desired response length and level of detail.
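
To build intuition for how text is split into tokens, you can run any tokenizer over a sample sentence. The sketch below uses tiktoken's cl100k_base encoding purely as an illustration; Solar uses its own tokenizer, so its exact splits and counts will differ.

```python
# Illustration only: Solar's own tokenizer will produce different splits and counts.
import tiktoken

enc = tiktoken.get_encoding("cl100k_base")
text = "Hello, nice to meet you."

tokens = enc.encode(text)
print(len(tokens))                          # token count (not word count)
print([enc.decode([t]) for t in tokens])    # how the sentence was split into tokens
```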

Temperature (Creativity Control)

  • Definition: A value between 0 and 1 that controls the diversity and creativity of responses.

  • Characteristics:

    • Lower values (closer to 0): More precise and predictable responses.

      • Ideal for technical documents, data analysis, and factual explanations.

    • Higher values (closer to 1): More creative and varied responses.

      • Suitable for marketing copy, storytelling, and brainstorming.
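
The sketch below shows both settings in an API call: the same question is sent twice, once with a low temperature and once with a high one, while max_tokens caps the length of each reply. The base URL and model name are illustrative assumptions.

```python
# Sketch: controlling response length (max_tokens) and creativity (temperature).
# Base URL and model name are illustrative assumptions; check Upstage's API docs.
from openai import OpenAI

client = OpenAI(
    api_key="YOUR_UPSTAGE_API_KEY",
    base_url="https://api.upstage.ai/v1/solar",
)

question = "Suggest a name and a one-line slogan for a weather chatbot."

for temperature in (0.0, 1.0):
    response = client.chat.completions.create(
        model="solar-mini",
        messages=[{"role": "user", "content": question}],
        max_tokens=100,           # cap the reply at 100 tokens
        temperature=temperature,  # 0 = precise and repeatable, 1 = more varied
    )
    print(f"temperature={temperature}:")
    print(response.choices[0].message.content)
```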

Now, let’s try using the Solar LLM in practice!

3. πŸ› οΈ Demo: Exploring Solar in the Upstage Playground

What is Upstage Console Playground?

  • A real-time LLM chatbot environment provided by Upstage.

  • Allows users to experiment with text generation, document summarization, translation, and more.

  • Designed for both developers and non-developers.

πŸ“Œ Hands-On Goals

  • Access the Playground and test Solar models

  • Enter simple prompts and analyze responses

  • Adjust settings like model selection and temperature

πŸ’‘ How to Use the Playground

  1. Access the Playground

  2. Select a Model

    • Choose Solar Mini or Solar Pro.

  3. Enter a Prompt

  4. Adjust Parameters

    • Max Tokens: Set the response length.

    • Temperature: Adjust response creativity.

  5. Review and Compare Responses

    • Test different models and settings to observe variations.
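
The same experiment can also be reproduced outside the web UI. The sketch below mirrors the Playground steps programmatically: pick a model, send a prompt, set the parameters, and compare the replies from Solar Mini and Solar Pro. Model names and the base URL are illustrative assumptions.

```python
# Sketch: comparing Solar Mini and Solar Pro on the same prompt and settings.
# Model names and base URL are illustrative assumptions; check Upstage's API docs.
from openai import OpenAI

client = OpenAI(
    api_key="YOUR_UPSTAGE_API_KEY",
    base_url="https://api.upstage.ai/v1/solar",
)

prompt = "Summarize the benefits of running an LLM on a single GPU in two sentences."

for model in ("solar-mini", "solar-pro"):
    response = client.chat.completions.create(
        model=model,
        messages=[{"role": "user", "content": prompt}],
        max_tokens=200,
        temperature=0.3,
    )
    print(f"--- {model} ---")
    print(response.choices[0].message.content)
```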

4. What is Prompt Engineering?

Prompt Engineering is the art of designing optimal prompts to help LLMs generate accurate and useful responses.

Crafting a well-structured prompt can enhance AI-generated outputs, improving productivity and efficiency.

πŸ’‘ Best Practices for Prompt Engineering

  1. Be Clear and Specific

    • ❌ "Where should I travel?"

    • βœ… "Recommend three scenic international destinations for August."

  2. Provide Context

    • ❌ "Write an ad for my company."

    • βœ… "Create a 50-character LinkedIn/Twitter ad for 'Solar LLM,' highlighting its performance and cost-effectiveness."

  3. Assign a Role

    • βœ… "You are a financial analyst. Provide a long-term investment strategy for 2024."

  4. Specify Output Format

    • βœ… "Summarize my budget plan in a table format by category (food, transport, leisure)."

  5. Provide Examples

    • βœ… "Write an ad similar to β€˜Sign up now and get 20% off!’"

πŸ’¬ Prompt Examples

❌ Bad Prompt Example

"Create a marketing strategy for a new product launch.”

The scope is too broad, and key details like target market, budget, and channels are unclear.

βœ… Good Prompt Example

[Role Assignment] You are a global marketing consultant.

[Context]

A startup has developed an AI-powered customer support solution and is preparing to launch a new product. Their primary customers are small businesses, and their goal is to attract as many companies as possible within a limited budget.

Based on the following details, create a 6-month marketing strategy:

  • Target Market: North America, Europe

  • Marketing Budget: $500,000

  • Key Channels: LinkedIn ads, Google search ads, email marketing

  • Goal: Acquire 500 business clients within the first 3 months

[Output Format] Please structure your response as follows:

  1. Market Analysis: Compare AI customer support trends and key competitors in a table.

  2. Marketing Strategy: Split into the first 3 months and the next 3 months with specific action plans.

  3. Budget Allocation: Breakdown of marketing spend by channel (%)

  4. Performance Metrics (KPIs): Click-through rate, new customer acquisition, website visits

[Example Reference] Include a sample excerpt from a previous marketing strategy document.

Ensure the response follows this format.
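
As a rough sketch, the structured prompt above can be sent through the API by putting the role assignment in the system message and the context, details, and output format in the user message. The base URL and model name are illustrative assumptions.

```python
# Sketch: sending the structured prompt above via the chat API.
# Role assignment -> system message; context, details, and format -> user message.
# Base URL and model name are illustrative assumptions; check Upstage's API docs.
from openai import OpenAI

client = OpenAI(
    api_key="YOUR_UPSTAGE_API_KEY",
    base_url="https://api.upstage.ai/v1/solar",
)

system_prompt = "You are a global marketing consultant."

user_prompt = """A startup has developed an AI-powered customer support solution and is
preparing to launch a new product. Their primary customers are small businesses, and their
goal is to attract as many companies as possible within a limited budget.

Based on the following details, create a 6-month marketing strategy:
- Target Market: North America, Europe
- Marketing Budget: $500,000
- Key Channels: LinkedIn ads, Google search ads, email marketing
- Goal: Acquire 500 business clients within the first 3 months

Structure your response as:
1. Market Analysis: compare AI customer support trends and key competitors in a table.
2. Marketing Strategy: split into the first 3 months and the next 3 months.
3. Budget Allocation: breakdown of marketing spend by channel (%).
4. Performance Metrics (KPIs): click-through rate, new customer acquisition, website visits.
"""

response = client.chat.completions.create(
    model="solar-pro",
    messages=[
        {"role": "system", "content": system_prompt},
        {"role": "user", "content": user_prompt},
    ],
    temperature=0.4,
)
print(response.choices[0].message.content)
```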

By structuring prompts effectively, AI can deliver more precise and useful responses.

πŸ› οΈ Continuous Testing and Optimization Are Essential!

Prompt engineering is not about getting a perfect result in a single attempt, but rather an iterative process of refinement and optimization.

  • Adjusting Style: Modify the tone or writing style of responses.

  • Tuning Response Length: Control the length of responses to avoid them being too short or too long.

  • Emphasizing Key Information: Highlight specific aspects such as price, features, or important details.

  • Providing Examples: Include sample outputs similar to the desired response.

  • Reiterating Key Information: Reinforce crucial points at the end of the prompt.

  • Considering English Inputs: LLMs may generate more precise responses when prompts are in English.

  • Using Markdown: Structure outputs using lists or tables for better readability.

πŸ’‘ Experiment with these techniques in different ways to discover the most effective prompt for your needs!

The Solar Prompt Cookbook is a comprehensive guide designed to help you craft effective prompts for Upstage's Solar model.

It explains the differences between Prompt Engineering and Fine-Tuning, offering a hands-on approach that covers everything from basic to advanced techniques. By comparing poor and well-crafted prompt examples, the cookbook guides you in optimizing your prompt design for the best results.

Wrap Up

πŸ”Ή Solar Mini: Lightweight and fast for efficient AI solutions.

πŸ”Ή Solar Pro: High-performance AI tailored for specialized domains.

πŸ”Ή LLM Configuration: Understanding input, prompt, output, max tokens, and temperature.

πŸ”Ή Playground Practice: Hands-on experience using Solar models.

πŸ”Ή Prompt Engineering: Effective techniques for crafting precise prompts.

YoungHoon Jeon | AI Edu | Upstage

πŸ’‘ Solar Mini is lightweight yet powerful, offering flexible applications across various environments.

πŸ’‘ Solar Pro provides a trusted AI solution for enterprises and is suitable for handling complex data across multiple industries.

πŸš€ Solar Prompt Cookbook

Ready to start designing prompts with Solar? πŸš€

πŸ“š Learn more

  • Read more about Solar Mini

  • Read more about Solar Pro

  • Try the Playground Now

  • GitHub - UpstageAI/solar-prompt-cookbook: Solar Prompt Cookbook