Opik LLM Gateway
The Opik LLM Gateway is a lightweight proxy server that lets you query different LLM APIs using the OpenAI format. It is designed for development and testing purposes and provides a centralized way to access multiple LLM providers through a single endpoint.
Gateway Overview
An LLM gateway is a proxy server that forwards requests to an LLM API and returns the response. This is useful when you want to centralize access to LLM providers or query multiple LLM providers from a single endpoint using a consistent request and response format.
The Opik LLM Gateway supports the OpenAI-compatible API format, making it easy to integrate with existing applications that use OpenAI’s API structure.
The Opik LLM Gateway is currently in beta and is subject to change.
Account Setup
Comet provides a hosted version of the Opik platform. Simply create an account and grab your API Key.
You can also run the Opik platform locally, see the installation guide for more information.
Getting Started
Configuring LLM Provider Credentials
To use the Opik LLM Gateway, you first need to configure your LLM provider credentials in the Opik UI. Once this is done, you can use the Opik gateway to query your LLM provider.
Using the Opik LLM Gateway
Once your LLM provider credentials are configured, you can make requests to the Opik LLM Gateway endpoint. The endpoint URL differs between Opik Cloud and self-hosted deployments.
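As a rough sketch, a chat completion request to the gateway can be assembled as below. The endpoint paths, port, and header names here are assumptions for illustration only; check your Opik deployment's documentation for the exact values.

```python
import json
import urllib.request

# Assumed endpoint URLs -- verify against your Opik deployment.
OPIK_CLOUD_URL = "https://www.comet.com/opik/api/v1/private/chat/completions"
SELF_HOSTED_URL = "http://localhost:5173/api/v1/private/chat/completions"


def build_chat_request(url: str, api_key: str, model: str, prompt: str):
    """Assemble an OpenAI-format chat completion request for the gateway."""
    body = {
        "model": model,  # must match a model configured in the Opik UI
        "messages": [{"role": "user", "content": prompt}],
        "temperature": 0.7,
    }
    headers = {
        "Content-Type": "application/json",
        # Assumption: the API key is sent in the Authorization header.
        "Authorization": api_key,
    }
    return urllib.request.Request(
        url, data=json.dumps(body).encode(), headers=headers, method="POST"
    )


# To actually send the request (requires a running gateway):
# req = build_chat_request(OPIK_CLOUD_URL, "<your-api-key>", "gpt-4o", "Hello!")
# with urllib.request.urlopen(req) as resp:
#     print(json.load(resp))
```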
Request Parameters
The Opik LLM Gateway accepts the following parameters in the request body:
model: The LLM model identifier (configured in the Opik UI)
messages: Array of message objects with role and content fields
temperature: Sampling temperature (0-2)
stream: Boolean to enable streaming responses
max_tokens: Maximum number of tokens to generate
Response Format
The gateway returns responses in the OpenAI-compatible format, making it easy to integrate with existing applications.
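To illustrate, the canned response below follows the OpenAI chat completion shape (the exact field values are made up), and the generated text is extracted the same way as with the OpenAI API:

```python
import json

# A canned example of an OpenAI-compatible chat completion response,
# shown only to illustrate the shape the gateway returns.
sample_response = json.loads("""
{
  "id": "chatcmpl-123",
  "object": "chat.completion",
  "model": "gpt-4o",
  "choices": [
    {
      "index": 0,
      "message": {"role": "assistant", "content": "Hello! How can I help?"},
      "finish_reason": "stop"
    }
  ],
  "usage": {"prompt_tokens": 9, "completion_tokens": 7, "total_tokens": 16}
}
""")

# Extract the generated text exactly as with the OpenAI API.
answer = sample_response["choices"][0]["message"]["content"]
print(answer)  # Hello! How can I help?
```

Because the shape matches OpenAI's, existing OpenAI client code can typically point at the gateway with only a base-URL change.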
Further Improvements
If you have suggestions for improving the Opik LLM Gateway, please let us know by opening an issue on GitHub.