Observability for Haystack with Opik

Haystack is an open-source framework for building production-ready LLM applications, retrieval-augmented generation (RAG) pipelines, and state-of-the-art search systems that work intelligently over large document collections.

In this guide, we will showcase how to integrate Opik with Haystack so that all the Haystack calls are logged as traces in Opik.

Account Setup

Comet provides a hosted version of the Opik platform. Simply create an account and grab your API key.

You can also run the Opik platform locally; see the installation guide for more information.

Opik integrates with Haystack to log traces for all Haystack pipelines.

Getting Started

Installation

First, ensure you have both opik and haystack-ai installed:

$ pip install opik haystack-ai

Configuring Opik

Configure the Opik Python SDK for your deployment type. See the Python SDK Configuration guide for detailed instructions on:

  • CLI configuration: opik configure
  • Code configuration: opik.configure()
  • Self-hosted vs Cloud vs Enterprise setup
  • Configuration files and environment variables
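
As an example of the code-based route, here is a minimal sketch for a Comet-hosted deployment; the api_key and workspace values below are placeholders, not real credentials:

import opik

# Placeholders: replace with your own Opik API key and workspace name
opik.configure(
    api_key="YOUR_OPIK_API_KEY",
    workspace="YOUR_WORKSPACE",
)

# For a self-hosted deployment, point the SDK at your local instance instead:
# opik.configure(use_local=True)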

Configuring Haystack

To use Haystack with OpenAI models, you will need to configure an OpenAI API key. If you are using another provider, replace this with the required API key. You can find or create your OpenAI API key on this page.

You can set it as an environment variable:

$ export OPENAI_API_KEY="YOUR_API_KEY"

Or set it programmatically:

import os
import getpass

if "OPENAI_API_KEY" not in os.environ:
    os.environ["OPENAI_API_KEY"] = getpass.getpass("Enter your OpenAI API key: ")

Creating the Haystack pipeline

In this example, we will create a simple pipeline that uses a prompt template to translate text to German.

To enable Opik tracing, we will:

  1. Enable content tracing in Haystack by setting the environment variable HAYSTACK_CONTENT_TRACING_ENABLED=true
  2. Add the OpikConnector component to the pipeline

Note: The OpikConnector is a special component that automatically logs the pipeline's traces to Opik; it should not be connected to any other component.

import os

os.environ["HAYSTACK_CONTENT_TRACING_ENABLED"] = "true"

from haystack import Pipeline
from haystack.components.builders import ChatPromptBuilder
from haystack.components.generators.chat import OpenAIChatGenerator
from haystack.dataclasses import ChatMessage

from opik.integrations.haystack import OpikConnector

pipe = Pipeline()

# Add the OpikConnector component to the pipeline
pipe.add_component("tracer", OpikConnector("Chat example"))

# Continue building the pipeline
pipe.add_component("prompt_builder", ChatPromptBuilder())
pipe.add_component("llm", OpenAIChatGenerator(model="gpt-3.5-turbo"))

pipe.connect("prompt_builder.prompt", "llm.messages")

messages = [
    ChatMessage.from_system(
        "Always respond in German even if some input data is in other languages."
    ),
    ChatMessage.from_user("Tell me about {{location}}"),
]

response = pipe.run(
    data={
        "prompt_builder": {
            "template_variables": {"location": "Berlin"},
            "template": messages,
        }
    }
)

trace_id = response["tracer"]["trace_id"]
print(f"Trace ID: {trace_id}")
print(response["llm"]["replies"][0])

The trace is now logged to the Opik platform.

Cost Tracking

The OpikConnector automatically tracks token usage and cost for all supported LLM models used within Haystack pipelines.

Cost information is automatically captured and displayed in the Opik UI, including:

  • Token usage details
  • Cost per request based on model pricing
  • Total trace cost

View the complete list of supported models and providers on the Supported Models page.
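
If you also want to inspect token usage locally, the reply metadata is a good place to look. A minimal sketch, assuming the generator attaches a usage dict to the reply's meta (the exact keys depend on your Haystack and provider versions):

# "response" is the pipeline run result from the example above.
# Assumption: OpenAIChatGenerator stores a "usage" dict in the reply's
# meta; key names may vary by Haystack/provider version.
reply = response["llm"]["replies"][0]
usage = reply.meta.get("usage", {})
print(f"Prompt tokens: {usage.get('prompt_tokens')}")
print(f"Completion tokens: {usage.get('completion_tokens')}")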

To ensure traces are correctly logged, make sure you set the environment variable HAYSTACK_CONTENT_TRACING_ENABLED to true before running the pipeline.

Advanced usage

Ensuring the trace is logged

By default, the OpikConnector flushes the trace to the Opik platform after each component, in a thread-blocking way. If this overhead is a problem, you can disable flushing after each component by setting the HAYSTACK_OPIK_ENFORCE_FLUSH environment variable to false.
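
A quick way to set this in code, before the pipeline runs:

import os

# Disable the thread-blocking flush after each component; the trace is
# then only sent when you flush explicitly (see the caution below).
os.environ["HAYSTACK_OPIK_ENFORCE_FLUSH"] = "false"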

Caution: Disabling this feature may result in data loss if the program crashes before the data is sent to Opik. If you disable it, make sure to call the flush() method explicitly before the program exits:

from haystack.tracing import tracer

tracer.actual_tracer.flush()

Getting the trace ID

If you would like to log additional information to the trace, you will need its trace ID. You can retrieve it from the tracer key in the pipeline's response:

response = pipe.run(
    data={
        "prompt_builder": {
            "template_variables": {"location": "Berlin"},
            "template": messages,
        }
    }
)

trace_id = response["tracer"]["trace_id"]
print(f"Trace ID: {trace_id}")

Updating logged traces

The OpikConnector returns the logged trace ID in the pipeline run response. You can use this ID to update the trace with feedback scores or other metadata:

import opik

response = pipe.run(
    data={
        "prompt_builder": {
            "template_variables": {"location": "Berlin"},
            "template": messages,
        }
    }
)

# Get the trace ID from the pipeline run response
trace_id = response["tracer"]["trace_id"]

# Log the feedback score
opik_client = opik.Opik()
opik_client.log_traces_feedback_scores([
    {"id": trace_id, "name": "user-feedback", "value": 0.5}
])