Observability for AISuite with Opik

This guide explains how to integrate Opik with the aisuite Python SDK. By using the track_aisuite method provided by opik, you can easily track and evaluate your aisuite API calls within your Opik projects: Opik will automatically log the input prompt, the model used, token usage, and the generated response.

Account Setup

Comet provides a hosted version of the Opik platform; simply create an account and grab your API key.

You can also run the Opik platform locally; see the installation guide for more information.
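
If you go the local route, one common path is to clone the repository and start the platform with its bundled script (a sketch assuming Docker is installed; the installation guide has the authoritative, up-to-date steps):

$ git clone https://github.com/comet-ml/opik.git
$ cd opik
$ ./opik.sh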

Getting Started

Installation

First, ensure you have both the opik and aisuite packages installed:

$ pip install opik "aisuite[openai]"
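
You can then verify that both packages import cleanly:

$ python -c "import opik, aisuite"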

Configuring Opik

Configure the Opik Python SDK for your deployment type. See the Python SDK Configuration guide for detailed instructions on:

  • CLI configuration: opik configure
  • Code configuration: opik.configure() (see the sketch after this list)
  • Self-hosted vs Cloud vs Enterprise setup
  • Configuration files and environment variables
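
For example, a minimal code configuration might look like this (a sketch: opik.configure() accepts an API key and workspace for the hosted platform and a use_local flag for self-hosted deployments; the configuration guide covers the full set of options):

import opik

# Hosted (Comet) deployment: replace the placeholders with your credentials
opik.configure(api_key="YOUR_OPIK_API_KEY", workspace="YOUR_WORKSPACE")

# Self-hosted deployment: point the SDK at your local instance instead
# opik.configure(use_local=True)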

Configuring AISuite

To configure AISuite, you will need your OpenAI API key. You can find or create your OpenAI API key on this page.

You can set it as an environment variable:

$ export OPENAI_API_KEY="YOUR_API_KEY"

Or set it programmatically:

import os
import getpass

# Prompt for the key only if it is not already set in the environment
if "OPENAI_API_KEY" not in os.environ:
    os.environ["OPENAI_API_KEY"] = getpass.getpass("Enter your OpenAI API key: ")

Logging LLM calls

To log LLM calls to Opik, wrap the AISuite client with track_aisuite. All calls made through the wrapped client will then be logged to Opik:

from opik.integrations.aisuite import track_aisuite
import aisuite as ai

# Wrap the AISuite client so that all calls made through it are logged to Opik
client = track_aisuite(ai.Client(), project_name="aisuite-integration-demo")

messages = [
    {"role": "user", "content": "Write a short two sentence story about Opik."},
]

response = client.chat.completions.create(
    model="openai:gpt-4o",
    messages=messages,
    temperature=0.75,
)
print(response.choices[0].message.content)

Advanced Usage

Using with the @track decorator

If you have multiple steps in your LLM pipeline, you can use the @track decorator to log each step as its own span. If AISuite is called within one of these steps, the LLM call will be associated with that step:

from opik import track
from opik.integrations.aisuite import track_aisuite
import aisuite as ai

client = track_aisuite(ai.Client(), project_name="aisuite-integration-demo")

@track
def generate_story(prompt):
    res = client.chat.completions.create(
        model="openai:gpt-3.5-turbo", messages=[{"role": "user", "content": prompt}]
    )
    return res.choices[0].message.content

@track
def generate_topic():
    prompt = "Generate a topic for a story about Opik."
    res = client.chat.completions.create(
        model="openai:gpt-3.5-turbo", messages=[{"role": "user", "content": prompt}]
    )
    return res.choices[0].message.content

@track(project_name="aisuite-integration-demo")
def generate_opik_story():
    topic = generate_topic()
    story = generate_story(topic)
    return story

# Execute the multi-step pipeline
generate_opik_story()
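
Note that project_name is passed both to track_aisuite and to the outermost @track decorator, so the pipeline trace and its LLM calls are grouped under the same Opik project; generate_topic and generate_story are recorded as child spans of generate_opik_story.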

The trace can now be viewed in the UI, with hierarchical spans showing the relationship between the different steps.

Supported aisuite methods

The track_aisuite wrapper supports the following aisuite methods:

  • aisuite.Client.chat.completions.create()

If you would like to track another aisuite method, please let us know by opening an issue on GitHub.