"Get Started with OpenTelemetry"

Describes how to send data to Opik using OpenTelemetry

Opik provides native support for OpenTelemetry (OTel), allowing you to instrument your ML/AI applications with distributed tracing. This guide will show you how to directly integrate OpenTelemetry SDKs with Opik.

OpenTelemetry integration in Opik currently supports HTTP transport. We’re actively working on expanding the feature set - stay tuned for updates!

OpenTelemetry Endpoint Configuration

Base Endpoint

To start sending traces to Opik, configure your OpenTelemetry exporter with the base endpoint and the required headers:

export OTEL_EXPORTER_OTLP_ENDPOINT="https://www.comet.com/opik/api/v1/private/otel"
export OTEL_EXPORTER_OTLP_HEADERS="Authorization=<your-api-key>,projectName=<your-project-name>,Comet-Workspace=<your-workspace-name>"
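The headers value is a comma-separated list of `key=value` pairs. If you assemble it programmatically rather than hard-coding the string, a small sketch (the header names match the export command above; the values are placeholders for your own credentials):

```python
# Sketch: building the OTEL_EXPORTER_OTLP_HEADERS value in code.
# The header names match the export command above; the values are
# placeholders you would replace with your own credentials.
headers = {
    "Authorization": "<your-api-key>",
    "projectName": "<your-project-name>",
    "Comet-Workspace": "<your-workspace-name>",
}

# OTLP headers are passed as a comma-separated list of key=value pairs.
otlp_headers = ",".join(f"{k}={v}" for k, v in headers.items())
print(otlp_headers)
```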

Signal-Specific Endpoint

If your OpenTelemetry setup requires signal-specific configuration, you can use the traces endpoint. This is particularly useful when different signals (traces, metrics, logs) need to be sent to different endpoints:

export OTEL_EXPORTER_OTLP_TRACES_ENDPOINT="http://<YOUR-OPIK-INSTANCE>/api/v1/private/otel/v1/traces"

Custom via OpenTelemetry SDKs

You can use any OpenTelemetry SDK to send traces directly to Opik. OpenTelemetry provides SDKs for many languages (C++, .NET, Erlang/Elixir, Go, Java, JavaScript, PHP, Python, Ruby, Rust, Swift). This extends Opik’s language support beyond the official SDKs (Python and TypeScript). For more instructions, visit the OpenTelemetry documentation.

Here’s a Python example showing how to set up OpenTelemetry with Opik:

from opentelemetry import trace
from opentelemetry.exporter.otlp.proto.http.trace_exporter import (
    OTLPSpanExporter,
)
from opentelemetry.sdk.trace import TracerProvider
from opentelemetry.sdk.trace.export import BatchSpanProcessor

# For Comet-hosted installations
OPIK_ENDPOINT = "https://<COMET-SERVER>/api/v1/private/otel/v1/traces"
API_KEY = "<your-api-key>"
PROJECT_NAME = "<your-project-name>"
WORKSPACE_NAME = "<your-workspace-name>"

# Initialize the trace provider
provider = TracerProvider()
processor = BatchSpanProcessor(
    OTLPSpanExporter(
        endpoint=OPIK_ENDPOINT,
        headers={
            "Authorization": API_KEY,
            "projectName": PROJECT_NAME,
            "Comet-Workspace": WORKSPACE_NAME,
        },
    )
)
provider.add_span_processor(processor)
trace.set_tracer_provider(provider)

To trace OpenAI calls, use the OpenTelemetry instrumentation for OpenAI:

pip install opentelemetry-instrumentation-openai

And then instrument your OpenAI client:

from opentelemetry.instrumentation.openai import OpenAIInstrumentor

OpenAIInstrumentor().instrument()

Make sure to use the HTTP trace exporter (opentelemetry.exporter.otlp.proto.http.trace_exporter); the gRPC exporter will fail because Opik currently supports HTTP transport only.