"Get Started with OpenTelemetry"
Opik provides native support for OpenTelemetry (OTel), allowing you to instrument your ML/AI applications with distributed tracing. This guide will show you how to directly integrate OpenTelemetry SDKs with Opik.
OpenTelemetry integration in Opik currently supports HTTP transport. We’re actively working on expanding the feature set - stay tuned for updates!
OpenTelemetry Endpoint Configuration
Base Endpoint
To start sending traces to Opik, configure your OpenTelemetry exporter with the base endpoint that matches your deployment: Opik Cloud, a self-hosted deployment, or an Enterprise deployment. The exact URL and authentication headers depend on which deployment you use.
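As a minimal sketch, you can point a standard OTLP/HTTP exporter at Opik through the usual OpenTelemetry environment variables. The endpoint URL and header names below are placeholders, not confirmed values for your deployment, so substitute the settings shown in your Opik instance:

```python
import os

# Point the OTLP exporter at your Opik deployment using the standard
# OpenTelemetry environment variables. The URL below is a placeholder --
# use the base endpoint shown for your Opik Cloud, self-hosted, or
# Enterprise deployment.
os.environ["OTEL_EXPORTER_OTLP_ENDPOINT"] = "https://<your-opik-instance>/api/v1/private/otel"

# Authentication / workspace headers (assumed names; adjust to your deployment).
# OTEL_EXPORTER_OTLP_HEADERS expects comma-separated key=value pairs.
os.environ["OTEL_EXPORTER_OTLP_HEADERS"] = (
    "Authorization=<your-api-key>,Comet-Workspace=<your-workspace>"
)
```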
Signal-Specific Endpoint
If your OpenTelemetry setup requires signal-specific configuration, you can use the traces endpoint. This is particularly useful when different signals (traces, metrics, logs) need to be sent to different endpoints:
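For illustration, the standard OTEL_EXPORTER_OTLP_TRACES_ENDPOINT variable targets only the traces signal; the URL here is a placeholder (for OTLP over HTTP the traces path is conventionally the base endpoint followed by /v1/traces):

```python
import os

# Route only the traces signal to Opik; metrics and logs can be sent elsewhere.
# Placeholder URL -- substitute your deployment's traces endpoint.
os.environ["OTEL_EXPORTER_OTLP_TRACES_ENDPOINT"] = (
    "https://<your-opik-instance>/api/v1/private/otel/v1/traces"
)
```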
Custom via OpenTelemetry SDKs
You can use any OpenTelemetry SDK to send traces directly to Opik. OpenTelemetry provides SDKs for many languages (C++, .NET, Erlang/Elixir, Go, Java, JavaScript, PHP, Python, Ruby, Rust, Swift), which extends Opik’s language support beyond its official SDKs (Python and TypeScript). For detailed instructions, see the OpenTelemetry documentation.
Here’s a Python example showing how to set up OpenTelemetry with Opik:
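The snippet below is a minimal sketch of that setup using the OpenTelemetry Python SDK with the OTLP/HTTP span exporter. The endpoint URL and header values are placeholders, so substitute the values for your Opik deployment:

```python
from opentelemetry import trace
from opentelemetry.sdk.resources import Resource
from opentelemetry.sdk.trace import TracerProvider
from opentelemetry.sdk.trace.export import BatchSpanProcessor
from opentelemetry.exporter.otlp.proto.http.trace_exporter import OTLPSpanExporter

# Describe the service that emits the traces.
resource = Resource.create({"service.name": "my-llm-app"})

# Export spans over OTLP/HTTP to Opik (placeholder endpoint and headers --
# substitute the values for your deployment).
exporter = OTLPSpanExporter(
    endpoint="https://<your-opik-instance>/api/v1/private/otel/v1/traces",
    headers={
        "Authorization": "<your-api-key>",
        "Comet-Workspace": "<your-workspace>",
    },
)

# Register a tracer provider that batches and exports spans to Opik.
provider = TracerProvider(resource=resource)
provider.add_span_processor(BatchSpanProcessor(exporter))
trace.set_tracer_provider(provider)

tracer = trace.get_tracer(__name__)

# Create a simple span to verify the pipeline end to end.
with tracer.start_as_current_span("example-span") as span:
    span.set_attribute("example.attribute", "value")
```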
In order to track OpenAI calls, you need to install an OpenTelemetry instrumentation package for OpenAI and then instrument your OpenAI client:
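A sketch of what this can look like, assuming the opentelemetry-instrumentation-openai package (the model name and prompt are illustrative):

```python
# Requires an OpenAI instrumentation package, for example:
#   pip install opentelemetry-instrumentation-openai
from opentelemetry.instrumentation.openai import OpenAIInstrumentor
from openai import OpenAI

# Patch the OpenAI client so each API call is recorded as a span
# on the tracer provider configured above.
OpenAIInstrumentor().instrument()

client = OpenAI()
response = client.chat.completions.create(
    model="gpt-4o-mini",
    messages=[{"role": "user", "content": "Hello from an instrumented client!"}],
)
print(response.choices[0].message.content)
```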
Make sure to import the HTTP trace exporter (opentelemetry.exporter.otlp.proto.http.trace_exporter); if you use the gRPC exporter instead, you will run into errors, since Opik currently only supports HTTP transport.
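For reference, this is the import to use versus the one to avoid:

```python
# Correct: OTLP over HTTP, which Opik supports.
from opentelemetry.exporter.otlp.proto.http.trace_exporter import OTLPSpanExporter

# Incorrect for Opik: OTLP over gRPC will fail against the HTTP-only endpoint.
# from opentelemetry.exporter.otlp.proto.grpc.trace_exporter import OTLPSpanExporter
```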