OpenAI Integration

Opik provides seamless integration with the official OpenAI Node.js SDK through the opik-openai package, allowing you to trace, monitor, and debug your OpenAI API calls.

Features

  • Comprehensive Tracing: Automatically trace OpenAI API calls, including chat completions, embeddings, images, and more
  • Hierarchical Visualization: View your OpenAI requests as structured traces with parent-child relationships
  • Detailed Metadata Capture: Record model names, prompts, completions, token usage, and custom metadata
  • Error Handling: Capture and visualize errors encountered during OpenAI API interactions
  • Custom Tagging: Add custom tags to organize and filter your traces
  • Streaming Support: Full support for streamed responses with token-by-token tracing

Installation

Option 1: Using npm

$ npm install opik-openai openai

Option 2: Using yarn

$ yarn add opik-openai openai

Requirements

  • Node.js ≥ 18
  • OpenAI SDK (openai ≥ 4.0.0)
  • Opik SDK (automatically installed as a dependency)

Basic Usage

Using with OpenAI Client

To trace your OpenAI API calls, you need to wrap your OpenAI client instance with the trackOpenAI function:

```typescript
import OpenAI from "openai";
import { trackOpenAI } from "opik-openai";

// Initialize the original OpenAI client
const openai = new OpenAI({
  apiKey: process.env.OPENAI_API_KEY,
});

// Wrap the client with Opik tracking
const trackedOpenAI = trackOpenAI(openai);

// Use the tracked client just like the original
const completion = await trackedOpenAI.chat.completions.create({
  model: "gpt-4",
  messages: [{ role: "user", content: "Hello, how can you help me today?" }],
});

console.log(completion.choices[0].message.content);

// Ensure all traces are sent before your app terminates
await trackedOpenAI.flush();
```
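Errors raised during an API call are recorded on the trace as well (see Features), so you can handle failures in application code as usual. Below is a minimal sketch of a guard helper; `safeCompletion` and the `ChatClient` type are hypothetical names introduced here, not part of opik-openai, and the structural type covers only the slice of the client the helper touches:

```typescript
// Minimal client surface the helper relies on; the tracked client from the
// example above satisfies it, since it exposes the same API as the original.
type ChatClient = {
  chat: {
    completions: {
      create: (params: {
        model: string;
        messages: { role: string; content: string }[];
      }) => Promise<{ choices: { message: { content: string | null } }[] }>;
    };
  };
};

// Run a completion and surface any API error instead of letting it crash the
// app. Opik records the error on the trace either way; this only keeps the
// caller in control of what happens next.
async function safeCompletion(
  client: ChatClient,
  params: { model: string; messages: { role: string; content: string }[] }
): Promise<{ ok: true; content: string } | { ok: false; error: string }> {
  try {
    const completion = await client.chat.completions.create(params);
    return { ok: true, content: completion.choices[0]?.message?.content ?? "" };
  } catch (err) {
    return { ok: false, error: err instanceof Error ? err.message : String(err) };
  }
}
```

Because the tracked client is a drop-in replacement for the original, it can be passed to helpers like this unchanged.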

Using with Streaming Responses

The integration fully supports OpenAI’s streaming responses:

```typescript
import OpenAI from "openai";
import { trackOpenAI } from "opik-openai";

const openai = new OpenAI({ apiKey: process.env.OPENAI_API_KEY });
const trackedOpenAI = trackOpenAI(openai);

async function streamingExample() {
  // Create a streaming chat completion
  const stream = await trackedOpenAI.chat.completions.create({
    model: "gpt-3.5-turbo",
    messages: [{ role: "user", content: "What is streaming?" }],
    stream: true,
    // Include usage in the stream
    stream_options: {
      include_usage: true,
    },
  });

  // Process the stream
  let streamedContent = "";
  for await (const chunk of stream) {
    const content = chunk.choices[0]?.delta?.content || "";
    process.stdout.write(content);
    streamedContent += content;
  }

  console.log("\nStreaming complete!");

  // Don't forget to flush when done
  await trackedOpenAI.flush();
}

streamingExample();
```
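If several call sites consume streams, the per-chunk accumulation loop above can be factored into a small helper. `collectStream` and `DeltaChunk` are hypothetical names introduced here, not part of opik-openai; `DeltaChunk` is a structural subset of the chunk shape yielded by the OpenAI SDK:

```typescript
// Structural subset of a streaming chat-completion chunk: each chunk carries
// an optional content delta for the first choice.
type DeltaChunk = { choices: { delta?: { content?: string } }[] };

// Accumulate the streamed deltas into the full assistant message, mirroring
// the for-await loop in the example above. Consuming the stream to the end
// also ensures the trace captures the complete response.
async function collectStream(stream: AsyncIterable<DeltaChunk>): Promise<string> {
  let content = "";
  for await (const chunk of stream) {
    content += chunk.choices[0]?.delta?.content ?? "";
  }
  return content;
}
```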

Advanced Configuration

The trackOpenAI function accepts an optional configuration object to customize the integration:

```typescript
import OpenAI from "openai";
import { trackOpenAI } from "opik-openai";
import { Opik } from "opik";

// Optional: Create a custom Opik client
const customOpikClient = new Opik({
  apiKey: "YOUR_OPIK_API_KEY", // If not using environment variables
  projectName: "openai-integration-project",
});

const existingOpikTrace = customOpikClient.trace({
  name: `Trace`,
  input: {
    prompt: `Hello, world!`,
  },
  output: {
    response: `Hello, world!`,
  },
});

const openai = new OpenAI({
  apiKey: process.env.OPENAI_API_KEY,
});

// Configure the tracked client with options
const trackedOpenAI = trackOpenAI(openai, {
  // Optional tags and metadata to include with all traces
  traceMetadata: {
    tags: ["openai", "production", "user-query"],
    environment: "production",
    version: "1.2.3",
    component: "recommendation-engine",
  },

  // Optional custom name for the generation/trace
  generationName: "ProductRecommendationService",

  // Optional pre-configured Opik client
  // If not provided, a singleton instance will be used
  client: customOpikClient,

  // Optional parent trace for hierarchical relationships
  parent: existingOpikTrace,
});

// Use the tracked client with your configured options
const response = await trackedOpenAI.embeddings.create({
  model: "text-embedding-ada-002",
  input: "This is a sample text for embeddings",
});

// Close the existing trace
existingOpikTrace.end();

// Flush before your application exits
await trackedOpenAI.flush();
```

Troubleshooting

Missing Traces: Ensure your OpenAI and Opik API keys are correct and that you’re calling await trackedOpenAI.flush() before your application exits.
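One way to make the final flush hard to forget is to wrap your work in a helper that flushes in a finally block, so traces are sent even when the work throws. `withFlush` and the `Flushable` type are hypothetical names introduced here, not part of opik-openai; the helper only assumes the client exposes flush(), which the tracked client does:

```typescript
// Anything that exposes flush() — the tracked OpenAI client qualifies.
type Flushable = { flush: () => Promise<void> };

// Run work against a tracked client and guarantee flush() runs afterwards,
// even if the work throws, so no traces are lost on early exit.
async function withFlush<C extends Flushable, T>(
  client: C,
  work: (client: C) => Promise<T>
): Promise<T> {
  try {
    return await work(client);
  } finally {
    await client.flush();
  }
}
```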

Incomplete Data: For streaming responses, make sure you’re consuming the entire stream before ending your application.

Hierarchical Traces: To create proper parent-child relationships, use the parent option in the configuration when you want OpenAI calls to be children of another trace.

Performance Impact: Trace data is buffered and sent asynchronously (which is why flush() exists), so the integration adds minimal overhead to your OpenAI API calls.