Observability for Google Gemini (TypeScript) with Opik

Opik provides seamless integration with the Google Generative AI Node.js SDK (@google/genai) through the opik-gemini package, allowing you to trace, monitor, and debug your Gemini API calls.

Features

  • Comprehensive Tracing: Automatically trace Gemini API calls, including text generation, chat, and multimodal interactions
  • Hierarchical Visualization: View your Gemini requests as structured traces with parent-child relationships
  • Detailed Metadata Capture: Record model names, prompts, completions, token usage, and custom metadata
  • Error Handling: Capture and visualize errors encountered during Gemini API interactions
  • Custom Tagging: Add custom tags to organize and filter your traces
  • Streaming Support: Full support for streamed responses with token-by-token tracing
  • VertexAI Support: Works with both Google AI Studio and Vertex AI endpoints

Gemini TypeScript Integration

Installation

Option 1: Using npm

```shell
npm install opik-gemini @google/genai
```

Option 2: Using yarn

```shell
yarn add opik-gemini @google/genai
```

Requirements

  • Node.js ≥ 18
  • Google Generative AI SDK (@google/genai ≥ 1.0.0)
  • Opik SDK (automatically installed as a dependency)

Note: The official Google GenAI SDK package is @google/genai (not @google/generative-ai). This is Google DeepMind's unified SDK for both the Gemini Developer API and Vertex AI.
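Both SDKs typically read their credentials from environment variables. A minimal setup might look like the following; GEMINI_API_KEY matches the usage in the examples below, while the OPIK_* variable names are the conventional Opik configuration variables (check your SDK version's configuration docs to confirm):

```shell
# Gemini API key, read via process.env.GEMINI_API_KEY in the examples below
export GEMINI_API_KEY="your-gemini-api-key"

# Opik credentials and project (assumed variable names; verify against your Opik SDK version)
export OPIK_API_KEY="your-opik-api-key"
export OPIK_PROJECT_NAME="gemini-integration-project"
```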

Basic Usage

Using with Google Generative AI Client

To trace your Gemini API calls, you need to wrap your Gemini client instance with the trackGemini function:

```typescript
import { GoogleGenAI } from "@google/genai";
import { trackGemini } from "opik-gemini";

// Initialize the original Gemini client
const genAI = new GoogleGenAI({
  apiKey: process.env.GEMINI_API_KEY,
});

// Wrap the client with Opik tracking
const trackedGenAI = trackGemini(genAI);

// Generate content
const response = await trackedGenAI.models.generateContent({
  model: "gemini-2.0-flash-001",
  contents: "Hello, how can you help me today?",
});

console.log(response.text);

// Ensure all traces are sent before your app terminates
await trackedGenAI.flush();
```

Using with Streaming Responses

The integration fully supports Gemini’s streaming responses:

```typescript
import { GoogleGenAI } from "@google/genai";
import { trackGemini } from "opik-gemini";

const genAI = new GoogleGenAI({ apiKey: process.env.GEMINI_API_KEY });
const trackedGenAI = trackGemini(genAI);

async function streamingExample() {
  // Create a streaming generation
  const response = await trackedGenAI.models.generateContentStream({
    model: "gemini-2.0-flash-001",
    contents: "Write a short story about AI observability",
  });

  // Process the stream
  let streamedContent = "";
  for await (const chunk of response) {
    const chunkText = chunk.text;
    if (chunkText) {
      process.stdout.write(chunkText);
      streamedContent += chunkText;
    }
  }

  console.log("\nStreaming complete!");

  // Don't forget to flush when done
  await trackedGenAI.flush();
}

streamingExample();
```

Advanced Configuration

The trackGemini function accepts an optional configuration object to customize the integration:

```typescript
import { GoogleGenAI } from "@google/genai";
import { trackGemini } from "opik-gemini";
import { Opik } from "opik";

// Optional: Create a custom Opik client
const customOpikClient = new Opik({
  apiKey: "YOUR_OPIK_API_KEY", // If not using environment variables
  projectName: "gemini-integration-project",
});

const existingOpikTrace = customOpikClient.trace({
  name: "Trace",
  input: {
    prompt: "Hello, world!",
  },
  output: {
    response: "Hello, world!",
  },
});

const genAI = new GoogleGenAI({
  apiKey: process.env.GEMINI_API_KEY,
});

// Configure the tracked client with options
const trackedGenAI = trackGemini(genAI, {
  // Optional metadata to include with all traces,
  // including an optional array of tags
  traceMetadata: {
    tags: ["gemini", "production", "user-query"],
    environment: "production",
    version: "1.2.3",
    component: "story-generator",
  },

  // Optional custom name for the generation/trace
  generationName: "StoryGenerationService",

  // Optional pre-configured Opik client
  // If not provided, a singleton instance will be used
  client: customOpikClient,

  // Optional parent trace for hierarchical relationships
  parent: existingOpikTrace,
});

// Use the tracked client with your configured options
const response = await trackedGenAI.models.generateContent({
  model: "gemini-2.0-flash-001",
  contents: "Generate a creative story",
});

console.log(response.text);

// Close the existing trace
existingOpikTrace.end();

// Flush before your application exits
await trackedGenAI.flush();
```

Using with VertexAI

The integration also supports Google’s VertexAI platform. Simply configure your Gemini client for VertexAI and wrap it with trackGemini:

```typescript
import { GoogleGenAI } from "@google/genai";
import { trackGemini } from "opik-gemini";

// Configure for VertexAI
const genAI = new GoogleGenAI({
  vertexai: true,
  project: "your-project-id",
  location: "us-central1",
});

const trackedGenAI = trackGemini(genAI);

const response = await trackedGenAI.models.generateContent({
  model: "gemini-2.0-flash-001",
  contents: "Write a short story about AI observability",
});

console.log(response.text);

// Flush before your application exits
await trackedGenAI.flush();
```

Chat Conversations

Track multi-turn chat conversations with Gemini:

```typescript
import { GoogleGenAI } from "@google/genai";
import { trackGemini } from "opik-gemini";

const genAI = new GoogleGenAI({ apiKey: process.env.GEMINI_API_KEY });
const trackedGenAI = trackGemini(genAI);

async function chatExample() {
  // Multi-turn conversation using generateContent with history
  const response = await trackedGenAI.models.generateContent({
    model: "gemini-2.0-flash-001",
    contents: [
      {
        role: "user",
        parts: [{ text: "Hello, I want to learn about AI observability." }],
      },
      {
        role: "model",
        parts: [
          {
            text: "Great! AI observability helps track and debug LLM applications.",
          },
        ],
      },
      {
        role: "user",
        parts: [{ text: "What are the key benefits of using Opik?" }],
      },
    ],
  });

  console.log(response.text);

  await trackedGenAI.flush();
}

chatExample();
```

Troubleshooting

Missing Traces: Ensure your Gemini and Opik API keys are correct and that you’re calling await trackedGenAI.flush() before your application exits.
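A robust way to guarantee the flush happens is to put it in a `finally` block, so traces are sent even when a request throws. A minimal sketch of the pattern, using a hypothetical `fakeTrackedClient` stub in place of the real object returned by `trackGemini`:

```typescript
// Minimal sketch of a flush-on-completion pattern.
// `Flushable` and `fakeTrackedClient` are illustrative stand-ins,
// not part of the opik-gemini API.
type Flushable = { flush: () => Promise<void> };

let flushed = false;
const fakeTrackedClient: Flushable = {
  flush: async () => {
    flushed = true; // the real client would send queued traces here
  },
};

async function run(client: Flushable): Promise<string> {
  try {
    // ... make Gemini calls through the tracked client here ...
    return "done";
  } finally {
    // Runs on success and on error, so queued traces are never lost
    await client.flush();
  }
}

const result = await run(fakeTrackedClient);
console.log(result, flushed); // logs: done true
```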

Incomplete Data: For streaming responses, make sure you’re consuming the entire stream before ending your application.
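The point about fully consuming the stream can be illustrated with a plain async iterator; the `mockStream` generator below is a hypothetical stand-in for the async iterable returned by `generateContentStream`:

```typescript
// Mock stream standing in for a Gemini streaming response;
// each chunk exposes a `text` field, mirroring the examples above.
async function* mockStream(): AsyncGenerator<{ text: string }> {
  yield { text: "Hello" };
  yield { text: ", " };
  yield { text: "world" };
}

let full = "";
// The tracer can only record the complete response if this loop runs
// to the end; breaking out early leaves the trace with partial output.
for await (const chunk of mockStream()) {
  full += chunk.text;
}

console.log(full); // logs: Hello, world
```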

Hierarchical Traces: To create proper parent-child relationships, use the parent option in the configuration when you want Gemini calls to be children of another trace.

Performance Impact: The Opik integration adds minimal overhead to your Gemini API calls.

VertexAI Authentication: When using VertexAI, ensure you have properly configured your Google Cloud project credentials.
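For local development, Vertex AI clients usually pick up Application Default Credentials from the gcloud CLI. A typical setup (the project ID is a placeholder):

```shell
# Set up Application Default Credentials for local development
gcloud auth application-default login

# Point gcloud at the project used in the VertexAI example above
gcloud config set project your-project-id
```

In hosted environments (Cloud Run, GKE, etc.), the attached service account typically provides credentials without this step.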