Observability for OpenAI Codex with Opik
OpenAI Codex supports opt-in OpenTelemetry export through Codex configuration files.
When this guide applies
Use this guide if you run Codex (CLI/IDE/app) and want its OTEL trace exporter to send telemetry to Opik.
The block structure below follows the current Codex runtime config shape used in local config.toml ([otel.trace_exporter.otlp-http]).
Where to configure Codex
Codex reads configuration from:
- user config: ~/.codex/config.toml
- project config: .codex/config.toml
See Codex config basics.
Opik OTLP trace endpoint modes
For Opik OTEL endpoint behavior, see Opik OpenTelemetry overview.
- Opik Cloud
- Enterprise deployment
- Self-hosted instance
Required headers:
- Authorization
- Comet-Workspace
Optional headers:
- projectName (recommended)
Example intent and minimal valid setup
Intent: Route Codex OTEL trace export to Opik with project/workspace attribution.
Applies when: You have enabled Codex OTEL export and selected the OTLP/HTTP exporter in your config.
Required fields:
- trace_exporter = "otlp-http" under [otel]
- otel.trace_exporter exporter block (otlp-http)
- endpoint
- protocol (binary or json; binary recommended)
Optional fields:
- headers (projectName strongly recommended)
- otel.environment
- otel.log_user_prompt (keep false unless policy allows prompt export)
Minimal valid config:
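A minimal sketch of the fields above. The key path follows the [otel.trace_exporter.otlp-http] shape cited in this guide, but verify it against your installed Codex version's config reference; the endpoint base URL, API key, workspace, and project names are placeholders, not real values.

```toml
[otel]
environment = "production"   # optional
log_user_prompt = false      # keep false unless policy allows prompt export

# Sketch only; the exact exporter key path can vary by Codex version.
[otel.trace_exporter.otlp-http]
endpoint = "<your-opik-base-url>/otel/v1/traces"
protocol = "binary"          # binary recommended over json

[otel.trace_exporter.otlp-http.headers]
Authorization = "<your-opik-api-key>"
Comet-Workspace = "<your-workspace>"
projectName = "<your-project>"   # optional, but recommended for attribution
```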
Validation
- Run a Codex session after updating config.toml.
- Confirm OTLP HTTP requests are sent to /otel/v1/traces.
- Verify traces appear in the expected Opik workspace/project.
Notes
- Codex telemetry export is opt-in.
- Keep log_user_prompt = false unless your policy explicitly allows prompt text export.
- If your Codex build uses a different exporter key path, align with your installed version’s config reference.