How can I provide a 'system message' to models with Groq API?

The system message (or system prompt) is an optional parameter that helps set the context, behavior, and persona for the AI model throughout the conversation. It’s crucial to place it as the very first message in the array for it to be correctly interpreted by most models. Note that speech-to-text models like Whisper don’t take a system message; instead, a dedicated prompt parameter can be used to guide the spelling of uncommon words and proper nouns.

To provide a system message when using models like llama-3.3-70b-versatile with the Groq API, include it as the first message object in the messages array of your API request. Each message object must have a "role" and a "content" field.

Here’s an example structure for the messages array:

[
  {
    "role": "system",
    "content": "You are a helpful assistant that provides concise answers."
  },
  {
    "role": "user",
    "content": "What is the capital of Italy?"
  }
  // ... potentially more user/assistant messages for conversation history
]
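For reference, this is the shape of the JSON body that gets POSTed to Groq’s OpenAI-compatible chat completions endpoint. The sketch below just builds and prints that body in Python; the model ID is illustrative:

```python
import json

# Request body for Groq's OpenAI-compatible chat completions endpoint
# (POST https://api.groq.com/openai/v1/chat/completions).
# The model ID below is illustrative; use any model Groq currently serves.
payload = {
    "model": "llama-3.3-70b-versatile",
    "messages": [
        {
            "role": "system",
            "content": "You are a helpful assistant that provides concise answers.",
        },
        {"role": "user", "content": "What is the capital of Italy?"},
    ],
}

# Serialize to the JSON string that would be sent as the request body.
body = json.dumps(payload, indent=2)
print(body)
```

The system message sits at index 0 of "messages", exactly as in the array above.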

In a Python script using the groq library:

from groq import Groq
import os

client = Groq(api_key=os.environ.get("GROQ_API_KEY"))

chat_completion = client.chat.completions.create(
    messages=[
        {
            "role": "system",
            "content": "You are an expert on astrophysics and will explain concepts simply."
        },
        {
            "role": "user",
            "content": "Tell me about black holes."
        }
    ],
    model="llama-3.3-70b-versatile"  # Or your desired model ID
)

print(chat_completion.choices[0].message.content)
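Because the system message must stay first as the conversation grows, it helps to rebuild the messages array on every turn rather than appending blindly. Here is a minimal sketch of that pattern; `messages_for_request` and `record_turn` are hypothetical helpers, not part of the groq library:

```python
# Sketch: keep the system message pinned first across a multi-turn conversation.
# messages_for_request() and record_turn() are hypothetical helpers for illustration.

system_message = {
    "role": "system",
    "content": "You are an expert on astrophysics and will explain concepts simply.",
}

history = []  # alternating user/assistant turns, excluding the system message

def messages_for_request(user_input):
    """Build the array for one request: system first, then history, then the new user turn."""
    return [system_message] + history + [{"role": "user", "content": user_input}]

def record_turn(user_input, assistant_reply):
    """After a completion returns, append both sides of the exchange to history."""
    history.append({"role": "user", "content": user_input})
    history.append({"role": "assistant", "content": assistant_reply})

# With the client above, a turn would look like:
#   msgs = messages_for_request("Tell me about black holes.")
#   completion = client.chat.completions.create(messages=msgs, model="llama-3.3-70b-versatile")
#   record_turn("Tell me about black holes.", completion.choices[0].message.content)
msgs = messages_for_request("Tell me about black holes.")
```

No matter how long the history gets, msgs[0] is always the system message, so the model keeps its persona for the whole conversation.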