Overview

Opik helps you easily log, visualize, and evaluate everything from raw LLM calls to advanced RAG pipelines and complex agentic systems through a robust set of integrations with popular frameworks and tools.

| Integration | Description | Documentation | Try in Colab |
| --- | --- | --- | --- |
| OpenAI | Log traces for all OpenAI LLM calls | Documentation | Open Quickstart In Colab |
| OpenAI Agents | Log traces for OpenAI agents calls | Documentation | |
| Google ADK | Log traces for Google ADK agents | Documentation | |
| OpenRouter | Log traces for all OpenRouter LLM calls made via the OpenAI SDK | Documentation | |
| LiteLLM | Call any LLM model using the OpenAI format | Documentation | Open Quickstart In Colab |
| LangChain | Log traces for all LangChain LLM calls | Documentation | Open Quickstart In Colab |
| Haystack | Log traces for all Haystack pipelines | Documentation | Open Quickstart In Colab |
| aisuite | Log traces for all aisuite LLM calls | Documentation | Open Quickstart In Colab |
| Autogen | Log traces for all Autogen LLM calls | Documentation | |
| AG2 | Log traces for all AG2 LLM calls | Documentation | |
| Agno | Log traces for all Agno agent orchestration framework calls | Documentation | |
| Anthropic | Log traces for all Anthropic LLM calls | Documentation | Open Quickstart In Colab |
| Bedrock | Log traces for all AWS Bedrock LLM calls | Documentation | Open Quickstart In Colab |
| CrewAI | Log traces for all CrewAI LLM calls | Documentation | Open Quickstart In Colab |
| DeepSeek | Log traces for all LLM calls made with DeepSeek | Documentation | |
| Dify | Log traces and LLM calls for your Dify apps | Documentation | |
| DSPy | Log traces for all DSPy runs | Documentation | Open Quickstart In Colab |
| Gemini | Log traces for all Gemini LLM calls | Documentation | Open Quickstart In Colab |
| Groq | Log traces for all Groq LLM calls | Documentation | Open Quickstart In Colab |
| Guardrails | Log traces for all Guardrails validations | Documentation | Open Quickstart In Colab |
| Instructor | Log traces for LLM calls made with Instructor | Documentation | Open Quickstart In Colab |
| LangChainJS | Log traces for all LangChainJS executions | Documentation | |
| LangGraph | Log traces for all LangGraph executions | Documentation | Open Quickstart In Colab |
| LlamaIndex | Log traces for all LlamaIndex LLM calls | Documentation | Open Quickstart In Colab |
| Ollama | Log traces for all Ollama LLM calls | Documentation | Open Quickstart In Colab |
| Predibase | Fine-tune and serve open-source LLMs | Documentation | Open Quickstart In Colab |
| Pydantic AI | Python agent framework for building production-grade applications | Documentation | Open Quickstart In Colab |
| Ragas | Evaluation framework for your Retrieval Augmented Generation (RAG) pipelines | Documentation | Open Quickstart In Colab |
| Smolagents | Agent framework built by the Hugging Face team | Documentation | Open Quickstart In Colab |
| Strands Agent | Log traces for all Strands Agent LLM calls | Documentation | |
| Vercel AI SDK | Log traces for all Vercel AI SDK LLM calls | Documentation | |
| watsonx | Log traces for all watsonx LLM calls | Documentation | Open Quickstart In Colab |
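
Most Python integrations follow the same pattern: wrap or patch the underlying client so every call is logged as a trace automatically. The sketch below shows this for the OpenAI integration, assuming `opik` and `openai` are installed and Opik has been configured (for example via `opik configure`); the model name is only an example.

```python
from openai import OpenAI

import opik
from opik.integrations.openai import track_openai

# Wrap the OpenAI client so each completion call is logged as a trace in Opik.
client = track_openai(OpenAI())

response = client.chat.completions.create(
    model="gpt-4o-mini",  # example model name
    messages=[{"role": "user", "content": "Summarize what Opik does in one sentence."}],
)
print(response.choices[0].message.content)


# Code that is not covered by a dedicated integration can still be traced
# by decorating your own functions with @opik.track.
@opik.track
def answer(question: str) -> str:
    result = client.chat.completions.create(
        model="gpt-4o-mini",
        messages=[{"role": "user", "content": question}],
    )
    return result.choices[0].message.content
```

Traces logged this way appear in the Opik UI alongside traces from any of the framework integrations listed above.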

If you would like to see more integrations, please open an issue on our GitHub repository.