xAI Grok
xAI is an AI company founded by Elon Musk that develops the Grok series of large language models. Grok models are designed to have access to real-time information and are built with a focus on truthfulness, competence, and maximum benefit to humanity.
This guide explains how to integrate Opik with xAI Grok via LiteLLM. Using Opik's LiteLLM integration, you can easily track and evaluate your xAI API calls within your Opik projects: Opik automatically logs the input prompt, the model used, token usage, and the generated response.
Getting Started
Configuring Opik
To get started, you need to configure Opik to send traces to your Comet project. You can do this by setting the OPIK_PROJECT_NAME environment variable:
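For example, using an illustrative project name:

```bash
export OPIK_PROJECT_NAME="grok-integration"
```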
You can also call the opik.configure method:
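A minimal sketch; `use_local=False` points the SDK at Comet-hosted Opik:

```python
import opik

# Prompts for your Comet API key and workspace if they are not already configured.
opik.configure(use_local=False)
```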
Configuring LiteLLM
Install the required packages:
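For example, with pip:

```bash
pip install opik litellm
```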
Create a LiteLLM configuration file (e.g., litellm_config.yaml):
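A minimal sketch of the config; the `xai/` prefix tells LiteLLM to route requests to xAI, and the Opik success callback logs each call (the model names and settings shown are illustrative):

```yaml
model_list:
  - model_name: grok-beta
    litellm_params:
      model: xai/grok-beta
      api_key: os.environ/XAI_API_KEY

litellm_settings:
  success_callback: ["opik"]
```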
Authentication
Set your xAI API key as an environment variable:
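Replace the placeholder with your own key:

```bash
export XAI_API_KEY="your-xai-api-key"
```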
You can obtain an xAI API key from the xAI Console.
Usage
Using LiteLLM Proxy Server
Start the LiteLLM proxy server:
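Assuming the configuration file created above:

```bash
litellm --config litellm_config.yaml
```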
Use the proxy server to make requests:
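A sketch using the OpenAI Python client against the proxy's default address (http://localhost:4000); the model name matches the `model_name` entry from the config above:

```python
from openai import OpenAI

# The proxy handles authentication with xAI, so a placeholder key is fine here.
client = OpenAI(base_url="http://localhost:4000", api_key="anything")

response = client.chat.completions.create(
    model="grok-beta",
    messages=[{"role": "user", "content": "What is Opik and why would I use it?"}],
)
print(response.choices[0].message.content)
```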
Direct Integration
You can also use LiteLLM directly in your Python code:
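A sketch registering LiteLLM's Opik logger so each completion call is traced (the import path follows LiteLLM's integration layout):

```python
import litellm
from litellm.integrations.opik.opik import OpikLogger

# Register the Opik callback so every completion call is logged as a trace.
opik_logger = OpikLogger()
litellm.callbacks = [opik_logger]

response = litellm.completion(
    model="xai/grok-beta",
    messages=[{"role": "user", "content": "Why is it important to track LLM calls?"}],
)
print(response.choices[0].message.content)
```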
Supported Models
xAI provides access to several Grok model variants:
- Grok Beta (grok-beta): The main conversational AI model with real-time information access
- Grok Vision Beta (grok-vision-beta): Multimodal model capable of processing text and images
- Grok Mini (grok-mini): A smaller, faster variant optimized for simpler tasks
For the most up-to-date list of available models, visit the xAI API documentation.
Real-time Information Access
One of Grok’s key features is its ability to access real-time information. This makes it particularly useful for questions about current events:
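For example, a current-events style prompt sent through the same direct integration (the question itself is illustrative):

```python
import litellm

response = litellm.completion(
    model="xai/grok-beta",
    messages=[
        {"role": "user", "content": "What are the most significant AI announcements from this week?"}
    ],
)
print(response.choices[0].message.content)
```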
Vision Capabilities
Grok Vision Beta can process both text and images:
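A sketch of a multimodal request; the image URL is a placeholder you would replace with your own:

```python
import litellm

response = litellm.completion(
    model="xai/grok-vision-beta",
    messages=[
        {
            "role": "user",
            "content": [
                {"type": "text", "text": "Describe what is shown in this image."},
                {
                    "type": "image_url",
                    "image_url": {"url": "https://example.com/photo.jpg"},
                },
            ],
        }
    ],
)
print(response.choices[0].message.content)
```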
Function Calling
Grok models support function calling for enhanced capabilities:
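A sketch using OpenAI-style tool definitions passed through LiteLLM; the weather tool is purely illustrative:

```python
import litellm

tools = [
    {
        "type": "function",
        "function": {
            "name": "get_current_weather",
            "description": "Get the current weather for a given city",
            "parameters": {
                "type": "object",
                "properties": {
                    "city": {"type": "string", "description": "Name of the city"},
                },
                "required": ["city"],
            },
        },
    }
]

response = litellm.completion(
    model="xai/grok-beta",
    messages=[{"role": "user", "content": "What is the weather like in Paris right now?"}],
    tools=tools,
)

# If the model decides to call the tool, the call details are returned here.
print(response.choices[0].message.tool_calls)
```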
Advanced Features
Temperature and Creativity Control
Control the creativity of Grok’s responses:
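For example, raising the temperature for more varied output (the values shown are illustrative):

```python
import litellm

response = litellm.completion(
    model="xai/grok-beta",
    messages=[{"role": "user", "content": "Write a short poem about space exploration."}],
    temperature=0.9,   # higher values -> more creative, less deterministic output
    max_tokens=200,
)
print(response.choices[0].message.content)
```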
System Messages for Behavior Control
Use system messages to guide Grok’s behavior:
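For example, constraining tone and format with a system message:

```python
import litellm

response = litellm.completion(
    model="xai/grok-beta",
    messages=[
        {
            "role": "system",
            "content": "You are a concise technical assistant. Answer in bullet points.",
        },
        {"role": "user", "content": "Summarize the benefits of tracing LLM calls."},
    ],
)
print(response.choices[0].message.content)
```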
Feedback Scores and Evaluation
Once your xAI calls are logged with Opik, you can evaluate your LLM application using Opik’s evaluation framework:
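A sketch of an evaluation run; the dataset, task, and metric choice are illustrative, and the Hallucination metric is an LLM-as-judge metric that needs credentials for its judge model:

```python
import litellm
from opik import Opik
from opik.evaluation import evaluate
from opik.evaluation.metrics import Hallucination

# Create (or reuse) a small dataset of inputs to evaluate against.
client = Opik()
dataset = client.get_or_create_dataset(name="grok-eval-demo")
dataset.insert([{"input": "What is the capital of France?"}])

def evaluation_task(item):
    # Call Grok for each dataset item and return the fields the metric expects.
    response = litellm.completion(
        model="xai/grok-beta",
        messages=[{"role": "user", "content": item["input"]}],
    )
    return {"input": item["input"], "output": response.choices[0].message.content}

evaluate(
    dataset=dataset,
    task=evaluation_task,
    scoring_metrics=[Hallucination()],
)
```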
Environment Variables
Make sure to set the following environment variables:
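A typical set, assuming Comet-hosted Opik (placeholder values shown):

```bash
export XAI_API_KEY="your-xai-api-key"          # xAI Grok API access
export OPIK_API_KEY="your-opik-api-key"        # Opik / Comet API key
export OPIK_WORKSPACE="your-workspace"         # Comet workspace name
export OPIK_PROJECT_NAME="your-project-name"   # project to log traces to
```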