Observability for Google Gemini with Opik

Gemini is a family of multimodal large language models developed by Google DeepMind.

VertexAI Support

Opik also supports Google VertexAI, Google's fully managed AI development platform, which provides access to Gemini models through the google-genai package. When using VertexAI, you can use the same track_genai wrapper with a google-genai client configured for VertexAI, letting you trace and monitor your Gemini model calls whether you call the Google AI API directly or go through VertexAI's enterprise platform.

Account Setup

Comet provides a hosted version of the Opik platform: simply create an account and grab your API key.

You can also run the Opik platform locally; see the installation guide for more information.

Getting Started

Installation

First, ensure you have both the opik and google-genai packages installed:

$pip install opik google-genai

Configuring Opik

Configure the Opik Python SDK for your deployment type. See the Python SDK Configuration guide for detailed instructions on:

  • CLI configuration: opik configure
  • Code configuration: opik.configure()
  • Self-hosted vs Cloud vs Enterprise setup
  • Configuration files and environment variables
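As a minimal sketch, code-based configuration might look like the following (the api_key and workspace values are placeholders, not real credentials; see the configuration guide for the options your deployment needs):

```python
import opik

# For Opik Cloud (hosted by Comet) - replace the placeholder
# credentials with your own API key and workspace name.
opik.configure(api_key="YOUR_OPIK_API_KEY", workspace="YOUR_WORKSPACE")

# For a self-hosted deployment, point the SDK at your local instance instead:
# opik.configure(use_local=True)
```

The CLI equivalent, `opik configure`, walks you through the same choices interactively and writes the result to a configuration file.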

Configuring Gemini

To configure Gemini, you will need your Gemini API key. See the following documentation page for instructions on how to retrieve it.

You can set it as an environment variable:

$export GOOGLE_API_KEY="YOUR_API_KEY"

Or set it programmatically:

import os
import getpass

if "GOOGLE_API_KEY" not in os.environ:
    os.environ["GOOGLE_API_KEY"] = getpass.getpass("Enter your Gemini API key: ")

Logging LLM calls

To log LLM calls to Opik, wrap the Gemini client with track_genai. All calls made with the wrapped client will then be logged to Opik:

import os

from google import genai
from opik.integrations.genai import track_genai

os.environ["OPIK_PROJECT_NAME"] = "gemini-integration-demo"

client = genai.Client()
gemini_client = track_genai(client)

prompt = """
Write a short two sentence story about Opik.
"""

response = gemini_client.models.generate_content(
    model="gemini-2.0-flash-001", contents=prompt
)
print(response.text)

Using with VertexAI

To use Opik with VertexAI, configure the google-genai client for VertexAI and wrap it with track_genai:

import os

from google import genai
from opik.integrations.genai import track_genai

# Configure for VertexAI
PROJECT_ID = "your-project-id"
LOCATION = "us-central1"

client = genai.Client(vertexai=True, project=PROJECT_ID, location=LOCATION)
vertexai_client = track_genai(client)

# Set project name for organization
os.environ["OPIK_PROJECT_NAME"] = "vertexai-integration-demo"

# Use the wrapped client
response = vertexai_client.models.generate_content(
    model="gemini-2.0-flash-001",
    contents="Write a short story about AI observability.",
)
print(response.text)

Advanced Usage

Using with the @track decorator

If you have multiple steps in your LLM pipeline, you can use the @track decorator to log a trace for each step. If Gemini is called within one of these steps, the LLM call will be associated with that step:

from opik import track

@track
def generate_story(prompt):
    response = gemini_client.models.generate_content(
        model="gemini-2.0-flash-001", contents=prompt
    )
    return response.text

@track
def generate_topic():
    prompt = "Generate a topic for a story about Opik."
    response = gemini_client.models.generate_content(
        model="gemini-2.0-flash-001", contents=prompt
    )
    return response.text

@track
def generate_opik_story():
    topic = generate_topic()
    story = generate_story(topic)
    return story

# Execute the multi-step pipeline
generate_opik_story()

The trace can now be viewed in the UI, with hierarchical spans showing the relationship between the different steps.

Cost Tracking

The track_genai wrapper automatically tracks token usage and cost for all supported Google AI models.

Cost information is automatically captured and displayed in the Opik UI, including:

  • Token usage details
  • Cost per request based on Google AI pricing
  • Total trace cost

View the complete list of supported models and providers on the Supported Models page.