Observability for Pydantic AI with Opik

Pydantic AI is a Python agent framework designed to build production-grade applications with Generative AI.

Pydantic AI’s primary advantage is its integration of Pydantic’s type-safe data validation, ensuring structured and reliable responses in AI applications.
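For instance, you can ask an agent to return a validated Pydantic model instead of free-form text. The sketch below is illustrative (the CityInfo schema and the model name are placeholders, not part of this guide's setup); depending on your pydantic-ai version, the parameter and attribute may be named output_type and result.output instead of result_type and result.data.

from pydantic import BaseModel
from pydantic_ai import Agent

# Illustrative schema: the agent's reply is validated against this model
class CityInfo(BaseModel):
    name: str
    country: str

agent = Agent("openai:gpt-4o", result_type=CityInfo)
result = agent.run_sync("Tell me about Paris.")
print(result.data)  # e.g. CityInfo(name='Paris', country='France')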

Account Setup

Comet provides a hosted version of the Opik platform: simply create an account and grab your API key.

You can also run the Opik platform locally, see the installation guide for more information.

Getting Started

Installation

To use the Pydantic AI integration with Opik, you will need to have Pydantic AI and logfire installed:

$ pip install --upgrade pydantic-ai logfire 'logfire[httpx]'

Configuring Pydantic AI

In order to use Pydantic AI, you will need to configure your LLM provider API keys. For this example, we’ll use OpenAI; you can find or create your API key in your OpenAI account settings.

You can set them as environment variables:

$ export OPENAI_API_KEY="YOUR_API_KEY"

Or set them programmatically:

import os
import getpass

if "OPENAI_API_KEY" not in os.environ:
    os.environ["OPENAI_API_KEY"] = getpass.getpass("Enter your OpenAI API key: ")

Configuring OpenTelemetry

You will need to set a few environment variables to make sure the data is logged to Opik.

If you are using Opik Cloud, set the following:

$ export OTEL_EXPORTER_OTLP_ENDPOINT=https://www.comet.com/opik/api/v1/private/otel
$ export OTEL_EXPORTER_OTLP_HEADERS='Authorization=<your-api-key>,Comet-Workspace=default'

To log the traces to a specific project, you can add the projectName parameter to the OTEL_EXPORTER_OTLP_HEADERS environment variable:

$ export OTEL_EXPORTER_OTLP_HEADERS='Authorization=<your-api-key>,Comet-Workspace=default,projectName=<your-project-name>'

You can also update the Comet-Workspace parameter to a different value if you would like to log the data to a different workspace.
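If you prefer to keep everything in Python, for example in a notebook, you can set the same variables with os.environ before calling logfire.configure() in the next section, so the OpenTelemetry exporter picks them up. This is a minimal sketch of the export commands above, using the same placeholders.

import os

# Must be set before logfire.configure() creates the OTLP exporter
os.environ["OTEL_EXPORTER_OTLP_ENDPOINT"] = "https://www.comet.com/opik/api/v1/private/otel"
os.environ["OTEL_EXPORTER_OTLP_HEADERS"] = (
    "Authorization=<your-api-key>,Comet-Workspace=default,projectName=<your-project-name>"
)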

Using Opik with Pydantic AI

To track your Pydantic AI agents, you will need to configure logfire, as this is the instrumentation framework Pydantic AI uses to enable tracing.

import logfire

logfire.configure(
    send_to_logfire=False,
)
logfire.instrument_httpx(capture_all=True)

Practical Example

Now that everything is configured, you can create and run Pydantic AI agents:

import nest_asyncio
from pydantic_ai import Agent

# Enable async support in Jupyter notebooks
nest_asyncio.apply()

# Create a simple agent
agent = Agent(
    "openai:gpt-4o",
    system_prompt="Be concise, reply with one sentence.",
)

# Run the agent
result = agent.run_sync('Where does "hello world" come from?')
print(result.data)
[Screenshot: Pydantic AI trace logged in Opik]

Advanced usage

You can reduce the amount of data logged to Opik by setting capture_all to False:

import logfire

logfire.configure(
    send_to_logfire=False,
)
logfire.instrument_httpx(capture_all=False)

When this parameter is set to False, we will not log the exact request made to the LLM provider.

Further improvements

If you would like to see us improve this integration, simply open a new feature request on GitHub.