Cohere

Cohere provides state-of-the-art large language models that excel at text generation, summarization, classification, and retrieval-augmented generation.

This guide explains how to integrate Opik with Cohere using the OpenAI SDK Compatibility API. By wrapping a client pointed at Cohere's compatibility endpoint with Opik's track_openai function, you can easily track and evaluate your Cohere API calls within your Opik projects: Opik automatically logs the input prompt, the model used, token usage, and the generated response.

Getting Started

Configuring Opik

To start tracking your Cohere LLM calls, you’ll need to have both opik and openai packages installed. You can install them using pip:

$ pip install opik openai

In addition, you can configure Opik by running the opik configure command, which will prompt you for your local server address, or for your API key if you are using the Cloud platform:

$ opik configure

Configuring Cohere

You’ll need to set your Cohere API key as an environment variable:

$ export COHERE_API_KEY="YOUR_API_KEY"
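Since a missing key only surfaces later as an authentication error from the API, it can help to check for it up front. A minimal sketch (the helper name require_cohere_key is illustrative, not part of Opik or Cohere):

```python
import os

def require_cohere_key() -> str:
    """Return the Cohere API key, failing fast if the environment variable is unset."""
    key = os.environ.get("COHERE_API_KEY")
    if not key:
        raise RuntimeError('COHERE_API_KEY is not set; run `export COHERE_API_KEY="..."` first')
    return key
```

Calling this once at startup turns a confusing mid-run authentication failure into an immediate, clear error.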

Tracking Cohere API calls

Leverage the OpenAI Compatibility API by replacing the base URL with Cohere’s endpoint when initializing the client:

```python
import os

from opik.integrations.openai import track_openai
from openai import OpenAI

client = OpenAI(
    api_key=os.environ.get("COHERE_API_KEY"),
    base_url="https://api.cohere.ai/compatibility/v1",  # Cohere Compatibility API endpoint
)

client = track_openai(client)

response = client.chat.completions.create(
    model="command-r7b-12-2024",  # Replace with the desired Cohere model
    messages=[
        {"role": "system", "content": "You are an assistant."},
        {"role": "user", "content": "Why is tracking and evaluation of LLMs important?"},
    ],
    temperature=0.7,
    max_tokens=100,
)

print(response.choices[0].message.content)
```

The track_openai wrapper automatically tracks and logs each API call, including the input prompt, the model used, token usage, and the generated response. You can view these logs in your Opik project dashboard.
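For reference, the token counts come from the response's usage field, which follows the OpenAI chat-completion shape. A hedged sketch of reading them yourself (summarize_usage is an illustrative helper, not an Opik API), demonstrated with a stand-in response object so it runs without an API key:

```python
from types import SimpleNamespace

def summarize_usage(response) -> dict:
    """Extract the token counts from an OpenAI-style chat completion response."""
    usage = response.usage
    return {
        "prompt_tokens": usage.prompt_tokens,
        "completion_tokens": usage.completion_tokens,
        "total_tokens": usage.total_tokens,
    }

# Stand-in object with the same attribute layout as the SDK's response.
fake = SimpleNamespace(
    usage=SimpleNamespace(prompt_tokens=25, completion_tokens=80, total_tokens=105)
)
print(summarize_usage(fake))
# {'prompt_tokens': 25, 'completion_tokens': 80, 'total_tokens': 105}
```

The same attributes are available on real responses returned by the tracked client.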

Using Cohere within a tracked function

If you are using Cohere within a function tracked with the @track decorator, you can use the tracked client as normal:

```python
import os

from opik import track
from opik.integrations.openai import track_openai
from openai import OpenAI

client = OpenAI(
    api_key=os.environ.get("COHERE_API_KEY"),
    base_url="https://api.cohere.ai/compatibility/v1",
)
tracked_client = track_openai(client)

@track
def generate_story(prompt):
    response = tracked_client.chat.completions.create(
        model="command-r7b-12-2024",
        messages=[{"role": "user", "content": prompt}],
    )
    return response.choices[0].message.content

@track
def generate_topic():
    prompt = "Generate a topic for a story about Opik."
    response = tracked_client.chat.completions.create(
        model="command-r7b-12-2024",
        messages=[{"role": "user", "content": prompt}],
    )
    return response.choices[0].message.content

@track
def generate_opik_story():
    topic = generate_topic()
    story = generate_story(topic)
    return story

generate_opik_story()
```

Supported Cohere models

The track_openai wrapper with Cohere’s compatibility API supports the following Cohere models:

  • command-r7b-12-2024 - Command R 7B model
  • command-r-plus - Command R Plus model
  • command-r - Command R model
  • command-light - Command Light model
  • command - Command model

Supported OpenAI methods

The track_openai wrapper supports the following OpenAI methods when used with Cohere:

  • client.chat.completions.create(), including support for stream=True mode
  • client.beta.chat.completions.parse()
  • client.beta.chat.completions.stream()
  • client.responses.create()
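When using stream=True, the client yields chunks in the OpenAI streaming format, where each chunk carries a content delta. A minimal sketch of assembling the full text from a stream (collect_stream_text is an illustrative helper; the mock chunks below stand in for what client.chat.completions.create(..., stream=True) yields, so the example runs without an API key):

```python
from types import SimpleNamespace

def collect_stream_text(chunks) -> str:
    """Concatenate the content deltas from OpenAI-style streaming chunks."""
    parts = []
    for chunk in chunks:
        delta = chunk.choices[0].delta.content
        if delta:  # the final chunk's delta content is typically None
            parts.append(delta)
    return "".join(parts)

# Mock chunks mimicking the shape of streamed chat-completion chunks.
def _chunk(text):
    return SimpleNamespace(choices=[SimpleNamespace(delta=SimpleNamespace(content=text))])

stream = [_chunk("Tracking "), _chunk("matters."), _chunk(None)]
print(collect_stream_text(stream))
# Tracking matters.
```

With a tracked client, iterating over the real stream this way still lets Opik log the completed call once the stream is consumed.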

If you would like to track another OpenAI method, please let us know by opening an issue on GitHub.