Observability for Anthropic with Opik

Anthropic is an AI safety and research company that’s working to build reliable, interpretable, and steerable AI systems.

This guide explains how to integrate Opik with the Anthropic Python SDK. By wrapping your client with the track_anthropic function provided by opik, you can track and evaluate your Anthropic API calls within your Opik projects: Opik automatically logs the input prompt, the model used, token usage, and the generated response.

Account Setup

Comet provides a hosted version of the Opik platform. Simply create an account and grab your API key.

You can also run the Opik platform locally; see the installation guide for more information.

Getting Started

Installation

To start tracking your Anthropic LLM calls, you’ll need to have both the opik and anthropic packages. You can install them using pip:

$ pip install opik anthropic

Configuring Opik

Configure the Opik Python SDK for your deployment type. See the Python SDK Configuration guide for detailed instructions on:

  • CLI configuration: opik configure
  • Code configuration: opik.configure() (see the sketch after this list)
  • Self-hosted vs Cloud vs Enterprise setup
  • Configuration files and environment variables
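
For example, a minimal programmatic setup for the hosted platform might look like the sketch below. The api_key and workspace values are placeholders, and use_local=True for self-hosted deployments is an assumption to verify against the configuration guide:

import opik

# Placeholder credentials for a Comet-hosted (cloud) deployment.
opik.configure(
    api_key="YOUR_OPIK_API_KEY",
    workspace="your-workspace",
)

# For a self-hosted deployment, the SDK can instead target a local instance:
# opik.configure(use_local=True)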

Configuring Anthropic

In order to configure Anthropic, you will need your Anthropic API key. You can find or create it on this page.

You can set it as an environment variable:

$ export ANTHROPIC_API_KEY="YOUR_API_KEY"

Or set it programmatically:

import os
import getpass

if "ANTHROPIC_API_KEY" not in os.environ:
    os.environ["ANTHROPIC_API_KEY"] = getpass.getpass("Enter your Anthropic API key: ")

Logging LLM calls

In order to log LLM calls to Opik, you will need to wrap the Anthropic client with track_anthropic. All calls made with the wrapped client are then logged to Opik:

import anthropic
from opik.integrations.anthropic import track_anthropic

anthropic_client = anthropic.Anthropic()
anthropic_client = track_anthropic(anthropic_client, project_name="anthropic-integration-demo")

PROMPT = "Why is it important to use an LLM monitoring tool like Comet's Opik to log traces and spans when working with Anthropic models?"

response = anthropic_client.messages.create(
    model="claude-3-5-sonnet-20241022",
    max_tokens=1024,
    messages=[
        {"role": "user", "content": PROMPT}
    ],
)
print("Response", response.content[0].text)
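
Streaming calls can be traced the same way. Here is a minimal sketch reusing the wrapped client and PROMPT from above; the stream helper is part of the Anthropic SDK, but whether streamed responses are fully captured may depend on your Opik version, so verify the resulting trace in the UI:

with anthropic_client.messages.stream(
    model="claude-3-5-sonnet-20241022",
    max_tokens=1024,
    messages=[{"role": "user", "content": PROMPT}],
) as stream:
    # Print tokens as they arrive; the wrapped client is assumed to log
    # the completed stream as a single call once it finishes.
    for text in stream.text_stream:
        print(text, end="", flush=True)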

Advanced Usage

Using with the @track decorator

If you have multiple steps in your LLM pipeline, you can use the @track decorator to log the traces for each step. If Anthropic is called within one of these steps, the LLM call will be associated with that corresponding step:

import os

import anthropic
from opik import track
from opik.integrations.anthropic import track_anthropic

os.environ["OPIK_PROJECT_NAME"] = "anthropic-integration-demo"

anthropic_client = anthropic.Anthropic()
anthropic_client = track_anthropic(anthropic_client)

@track
def generate_story(prompt):
    res = anthropic_client.messages.create(
        model="claude-3-5-sonnet-20241022",
        max_tokens=1024,
        messages=[{"role": "user", "content": prompt}],
    )
    return res.content[0].text

@track
def generate_topic():
    prompt = "Generate a topic for a story about Opik."
    res = anthropic_client.messages.create(
        model="claude-3-5-sonnet-20241022",
        max_tokens=1024,
        messages=[{"role": "user", "content": prompt}],
    )
    return res.content[0].text

@track
def generate_opik_story():
    topic = generate_topic()
    story = generate_story(topic)
    return story

# Execute the multi-step pipeline
generate_opik_story()

The trace can now be viewed in the UI, with hierarchical spans showing the relationship between the different steps.
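
In short-lived scripts, it can help to flush pending traces before the process exits so nothing queued is lost. A minimal sketch, assuming your SDK version exposes opik.flush_tracker() (check the SDK reference if unsure):

import opik

# Block until queued traces and spans have been sent to the Opik backend.
opik.flush_tracker()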

Cost Tracking

The track_anthropic wrapper automatically tracks token usage and cost for all supported Anthropic models.

Cost information is automatically captured and displayed in the Opik UI, including:

  • Token usage details
  • Cost per request based on Anthropic pricing
  • Total trace cost
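
For a quick local sanity check, the raw token counts these costs are based on are also available on the Anthropic response object itself (the usage fields below come from the Anthropic SDK):

# Token counts reported by the Anthropic SDK for a single request;
# Opik combines figures like these with model pricing to estimate cost.
print("Input tokens: ", response.usage.input_tokens)
print("Output tokens:", response.usage.output_tokens)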

View the complete list of supported models and providers on the Supported Models page.