LLM Model Registry Configuration
The list of LLM models that appear in Opik’s dropdowns (Playground, LLM-as-Judge, Automation Rules, Optimization Studio) is served by the backend from a YAML registry. The registry is composed from up to three sources, merged in this order:
- Classpath defaults — `llm-models-default.yaml` shipped inside the backend JAR. Always loaded. This is the source every deployment sees out of the box.
- Remote CDN YAML — opt-in. When enabled, the backend fetches a YAML from a URL you configure and refreshes it on a schedule. Self-hosted deployments are not required to use this; it’s primarily for operators who want to pick up new models between Opik releases without redeploying.
- Local override YAML — optional. A YAML file you mount into the backend container; its entries override or extend the defaults and the remote content.
This page describes how to configure these sources for self-hosted deployments.
Environment variables
All configuration is done via environment variables on the `opik-backend` container.
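The exact variable names are release-specific; consult the configuration reference for your Opik version. As an illustrative sketch only (every name below is an assumption, not a confirmed Opik setting), the remote and local-override tiers map to variables along these lines:

```yaml
# Hypothetical variable names for illustration — check your release's reference.
environment:
  # Remote CDN tier (opt-in)
  LLM_MODELS_REMOTE_ENABLED: "true"            # enable the scheduled remote fetch
  LLM_MODELS_REMOTE_URL: "https://cdn.example.com/llm-models.yaml"
  LLM_MODELS_REMOTE_REFRESH_INTERVAL: "24h"    # how often to re-fetch
  # Local override tier (optional)
  LLM_MODELS_OVERRIDE_PATH: "/etc/opik/my-models-override.yaml"
```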
YAML schema
Fields:
- `id` (required) — the model identifier used at inference time.
- `qualifiedName` (optional) — disambiguates models that exist under multiple providers (e.g. Gemini via Vertex AI vs. the Gemini API directly). Used as the routing key when set.
- `label` (optional) — the human-readable name shown in dropdowns. Falls back to `id` when omitted.
- `structuredOutput` (optional, default `false`) — whether the model supports JSON schema / tool-calling structured output mode.
- `reasoning` (optional, default `false`) — whether the model is a reasoning model (enforces temperature = 1.0 and unlocks reasoning-effort parameters in the UI).
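A minimal registry file using these fields might look like the sketch below. The model entries are illustrative, and the top-level layout (provider grouping and key names) is an assumption based on the field list above — check `llm-models-default.yaml` inside your backend JAR for the authoritative shape:

```yaml
# Illustrative sketch — verify the real layout against llm-models-default.yaml.
providers:
  - provider: openai
    models:
      - id: gpt-4o
        label: GPT-4o
        structuredOutput: true
      - id: o3-mini
        label: o3 Mini
        reasoning: true               # locks temperature to 1.0 in the UI
  - provider: gemini
    models:
      - id: gemini-2.5-pro
        qualifiedName: vertex_ai/gemini-2.5-pro   # routing key for the Vertex AI variant
        label: Gemini 2.5 Pro (Vertex AI)
        structuredOutput: true
```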
Merge behavior
Models are keyed by `id` across every provider. `qualifiedName` is used for routing lookups (to disambiguate `gemini-2.5-pro` under the Gemini direct API vs. Vertex AI), but override deduplication always uses `id`. When a merge happens:
- Add: an `id` not present in lower layers is appended to that provider’s list.
- Override: an `id` that matches a lower layer replaces the full definition. Partial overrides are not supported — supply all fields you want on the final model.
- Remove: not currently supported. Contact support if you need to hide a default model entirely.
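For example, because an override replaces the full definition, omitted fields are not inherited from the lower layer. The sketch below (layout assumed, as in the schema section) restates every field the final model should keep:

```yaml
# Override layer: replaces the default gpt-4o entry wholesale.
# label and structuredOutput must be restated even if the default
# already had them — partial overrides are not merged field-by-field.
providers:
  - provider: openai
    models:
      - id: gpt-4o
        label: GPT-4o (custom label)
        structuredOutput: true
```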
Configuration scenarios
Default behaviour
Leave the defaults in place. The backend serves the classpath `llm-models-default.yaml` shipped with your Opik release — no outbound traffic, no extra configuration. Upgrade Opik to pick up new models.
Enable the remote CDN fetch (optional)
If you want new models to reach your running deployment between Opik releases — e.g. if you run long-lived stacks on an extended upgrade cadence and want provider-side additions to land automatically — point the backend at a remote YAML:
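The shape of the configuration is: enable the fetch, supply a URL, and optionally tune the refresh schedule. The variable names below are illustrative assumptions — the real names are in your release's configuration reference:

```yaml
environment:
  LLM_MODELS_REMOTE_ENABLED: "true"          # hypothetical name
  LLM_MODELS_REMOTE_URL: "https://cdn.comet.ml/opik/llm-models-default.yaml"
  LLM_MODELS_REMOTE_REFRESH_INTERVAL: "24h"  # hypothetical name
```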
Comet SaaS uses https://cdn.comet.ml/opik/llm-models-default.yaml, regenerated daily by the Opik sync workflow — you can either mirror that content on your own CDN or point directly at it if your policies allow.
Remote fetch failures are logged but non-fatal: the backend keeps serving the last successful registry (or the classpath defaults if the first fetch fails), so enabling the remote tier never risks losing model routing.
Add a private fine-tuned model (Docker Compose)
Create /etc/opik/my-models-override.yaml on the host:
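A sketch of the override file — the model entry is a placeholder for your fine-tune, and the top-level layout follows the assumed shape from the schema section (verify against the shipped `llm-models-default.yaml`):

```yaml
# /etc/opik/my-models-override.yaml
providers:
  - provider: openai
    models:
      - id: ft:gpt-4o-mini:acme:support-bot:abc123   # placeholder fine-tune id
        label: Acme Support Bot
        structuredOutput: true
```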
Mount it into the backend container and set the path:
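A minimal Docker Compose fragment, assuming a hypothetical `LLM_MODELS_OVERRIDE_PATH` variable (check your release's configuration reference for the real name):

```yaml
services:
  opik-backend:
    volumes:
      - /etc/opik/my-models-override.yaml:/etc/opik/my-models-override.yaml:ro
    environment:
      LLM_MODELS_OVERRIDE_PATH: /etc/opik/my-models-override.yaml  # hypothetical name
```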
Add a private fine-tuned model (Kubernetes / Helm)
Create a ConfigMap with your override YAML:
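For example (the key name inside `data` is your choice; the model entry is a placeholder):

```yaml
apiVersion: v1
kind: ConfigMap
metadata:
  name: opik-llm-models-override
data:
  my-models-override.yaml: |
    providers:
      - provider: openai
        models:
          - id: ft:gpt-4o-mini:acme:support-bot:abc123
            label: Acme Support Bot
            structuredOutput: true
```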
Mount it in the backend Deployment by extending your Helm values:
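A hedged sketch, assuming the chart exposes conventional `extraVolumes` / `extraVolumeMounts` / `extraEnv` hooks on the backend component and the hypothetical path variable from above — hook names vary between chart versions, so check the chart's `values.yaml`:

```yaml
backend:
  extraVolumes:
    - name: llm-models-override
      configMap:
        name: opik-llm-models-override
  extraVolumeMounts:
    - name: llm-models-override
      mountPath: /etc/opik
      readOnly: true
  extraEnv:
    - name: LLM_MODELS_OVERRIDE_PATH   # hypothetical variable name
      value: /etc/opik/my-models-override.yaml
```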
Verification
After restart, check that your model appears:
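One way to check, assuming the backend exposes the merged registry over its REST API — the endpoint path is not specified here, so `<models-endpoint>` is a placeholder to replace with the path documented for your release:

```shell
# Fetch the served registry and look for the override entry.
curl -s "http://localhost:8080/<models-endpoint>" | grep "Acme Support Bot"
```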
The same list appears in the UI dropdowns within seconds of a browser refresh.