Mistral AI
Mistral AI provides cutting-edge large language models with excellent performance for text generation, reasoning, and specialized tasks like code generation.
This guide explains how to integrate Opik with Mistral AI via LiteLLM. Using the LiteLLM integration provided by Opik, you can easily track and evaluate your Mistral API calls within your Opik projects: Opik automatically logs the input prompt, the model used, token usage, and the generated response.
Getting Started
Configuring Opik
To start tracking your Mistral AI LLM calls, you’ll need to have both `opik` and `litellm` installed. You can install them using pip:
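```bash
pip install opik litellm
```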
In addition, you can configure Opik using the `opik configure` command, which will prompt you for your API key if you are using the Opik Cloud platform, or for your local server address if you are self-hosting:
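```bash
opik configure
```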
Configuring Mistral AI
You’ll need to set your Mistral AI API key as an environment variable:
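```bash
export MISTRAL_API_KEY="<your-mistral-api-key>"  # placeholder: replace with your own key
```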
Logging LLM calls
In order to log the LLM calls to Opik, you will need to create the `OpikLogger` callback and add it to LiteLLM. Once registered, you can make calls to LiteLLM as you normally would.
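A minimal sketch, assuming LiteLLM's built-in `OpikLogger` integration (the model name and prompt are illustrative):

```python
from litellm.integrations.opik.opik import OpikLogger
import litellm

# Create the OpikLogger callback and register it with LiteLLM
opik_logger = OpikLogger()
litellm.callbacks = [opik_logger]

# Subsequent LiteLLM calls are now automatically logged to Opik
response = litellm.completion(
    model="mistral/mistral-small-latest",  # illustrative model name
    messages=[
        {"role": "user", "content": "Why is tracking and evaluation of LLMs important?"}
    ],
)
print(response.choices[0].message.content)
```

The `mistral/` prefix tells LiteLLM to route the request to the Mistral AI API; any Mistral model supported by LiteLLM can be used in its place.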
Logging LLM calls within a tracked function
If you are using LiteLLM within a function tracked with the `@track` decorator, you will need to pass the `current_span_data` as metadata to the `litellm.completion` call.
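A sketch of this pattern, assuming the `opik.opik_context.get_current_span_data` helper (the `generate_story` function and model name are illustrative):

```python
from opik import track
from opik.opik_context import get_current_span_data
from litellm.integrations.opik.opik import OpikLogger
import litellm

opik_logger = OpikLogger()
litellm.callbacks = [opik_logger]

@track
def generate_story(prompt: str) -> str:
    # Passing the current span data links this LiteLLM call to the
    # surrounding trace instead of logging it as a separate trace
    response = litellm.completion(
        model="mistral/mistral-small-latest",  # illustrative model name
        messages=[{"role": "user", "content": prompt}],
        metadata={
            "opik": {
                "current_span_data": get_current_span_data(),
            },
        },
    )
    return response.choices[0].message.content

generate_story("Write a short story about LLM observability.")
```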