Observability for CrewAI with Opik

CrewAI is a cutting-edge framework for orchestrating autonomous AI agents.

CrewAI enables you to create AI teams where each agent has specific roles, tools, and goals, working together to accomplish complex tasks.

Think of it as assembling your dream team - each member (agent) brings unique skills and expertise, collaborating seamlessly to achieve your objectives.

Opik integrates with CrewAI to log traces for all CrewAI activity, including both classic Crew/Agent/Task pipelines and the new CrewAI Flows API.

Account Setup

Comet provides a hosted version of the Opik platform: simply create an account and grab your API key.

You can also run the Opik platform locally, see the installation guide for more information.

Getting Started

Installation

First, ensure you have both opik and crewai installed:

```shell
pip install opik crewai crewai-tools
```

Configuring Opik

Configure the Opik Python SDK for your deployment type. See the Python SDK Configuration guide for detailed instructions on:

  • CLI configuration: opik configure
  • Code configuration: opik.configure()
  • Self-hosted vs Cloud vs Enterprise setup
  • Configuration files and environment variables
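As a quick sketch of the environment-variable route, a cloud setup can be configured before any Opik import is used. The `OPIK_API_KEY` and `OPIK_WORKSPACE` variable names below are assumptions; confirm them in the Python SDK Configuration guide:

```python
import os

# Sketch only: configure Opik Cloud via environment variables.
# The variable names are assumptions — verify them against the
# Python SDK Configuration guide for your deployment type.
os.environ["OPIK_API_KEY"] = "YOUR_OPIK_API_KEY"
os.environ["OPIK_WORKSPACE"] = "YOUR_WORKSPACE"
```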

Configuring CrewAI

To configure CrewAI, you will need your LLM provider's API key. For this example, we'll use OpenAI. You can find or create your OpenAI API key on this page.

You can set it as an environment variable:

```shell
export OPENAI_API_KEY="YOUR_API_KEY"
```

Or set it programmatically:

```python
import os
import getpass

if "OPENAI_API_KEY" not in os.environ:
    os.environ["OPENAI_API_KEY"] = getpass.getpass("Enter your OpenAI API key: ")
```

Logging CrewAI calls

To log a CrewAI pipeline run, you can use the track_crewai function. This will log each CrewAI call to Opik, including LLM calls made by your agents.

CrewAI v1.0.0+ requires the crew parameter: To ensure LLM calls are properly logged in CrewAI v1.0.0 and later, you must pass your Crew instance to track_crewai(crew=your_crew). This is required because CrewAI v1.0.0+ changed how LLM providers are handled internally.

For CrewAI v0.x, the crew parameter is optional as LLM tracking works through LiteLLM delegation.
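If one code path needs to support both major versions, a small helper can gate the `crew` argument on the installed version string. The helper below is hypothetical (not part of the Opik or CrewAI APIs), shown only to make the version boundary concrete:

```python
# Hypothetical helper (not part of Opik or CrewAI): decide whether
# track_crewai should receive the crew instance, based on the
# installed crewai version string.
def needs_crew_param(crewai_version: str) -> bool:
    major = int(crewai_version.split(".")[0])
    return major >= 1

print(needs_crew_param("1.0.0"))   # v1.x: pass crew=your_crew
print(needs_crew_param("0.86.0"))  # v0.x: crew is optional
```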

Creating a CrewAI Project

The first step is to create our project. We will use an example from CrewAI’s documentation:

```python
from crewai import Agent, Crew, Task, Process


class YourCrewName:
    def agent_one(self) -> Agent:
        return Agent(
            role="Data Analyst",
            goal="Analyze data trends in the market",
            backstory="An experienced data analyst with a background in economics",
            verbose=True,
        )

    def agent_two(self) -> Agent:
        return Agent(
            role="Market Researcher",
            goal="Gather information on market dynamics",
            backstory="A diligent researcher with a keen eye for detail",
            verbose=True,
        )

    def task_one(self) -> Task:
        return Task(
            name="Collect Data Task",
            description="Collect recent market data and identify trends.",
            expected_output="A report summarizing key trends in the market.",
            agent=self.agent_one(),
        )

    def task_two(self) -> Task:
        return Task(
            name="Market Research Task",
            description="Research factors affecting market dynamics.",
            expected_output="An analysis of factors influencing the market.",
            agent=self.agent_two(),
        )

    def crew(self) -> Crew:
        return Crew(
            agents=[self.agent_one(), self.agent_two()],
            tasks=[self.task_one(), self.task_two()],
            process=Process.sequential,
            verbose=True,
        )
```

Running with Opik Tracking

Now we can import Opik’s tracker and run our crew. For CrewAI v1.0.0+, pass the crew instance to track_crewai to ensure LLM calls are logged:

```python
from opik.integrations.crewai import track_crewai

# Create the crew
my_crew = YourCrewName().crew()

track_crewai(project_name="crewai-integration-demo", crew=my_crew)

# Run the crew
result = my_crew.kickoff()

print(result)
```

Each run will now be logged to the Opik platform, including all agent activities and LLM calls.

Logging CrewAI Flows

Opik also supports the CrewAI Flows API. When you enable tracking with track_crewai, Opik automatically:

  • Tracks Flow.kickoff() and Flow.kickoff_async() as the root span/trace with inputs and outputs
  • Tracks flow step methods decorated with @start and @listen as nested spans
  • Captures any LLM calls (via LiteLLM) within those steps with token usage
  • Flow methods are compatible with other Opik integrations (e.g., OpenAI, Anthropic, LangChain) and the @opik.track decorator. Any spans created inside flow steps are correctly attached to the flow’s span tree.

Example:

```python
import litellm
from crewai.flow.flow import Flow, start, listen
from opik.integrations.crewai import track_crewai

track_crewai(project_name="crewai-integration-demo")


class ExampleFlow(Flow):
    model = "gpt-4o-mini"

    @start()
    def generate_city(self):
        response = litellm.completion(
            model=self.model,
            messages=[{"role": "user", "content": "Return the name of a random city."}],
        )
        return response["choices"][0]["message"]["content"]

    @listen(generate_city)
    def generate_fun_fact(self, random_city):
        response = litellm.completion(
            model=self.model,
            messages=[{"role": "user", "content": f"Tell me a fun fact about {random_city}"}],
        )
        return response["choices"][0]["message"]["content"]


flow = ExampleFlow()
result = flow.kickoff()
```

Cost Tracking

The track_crewai integration automatically tracks token usage and cost for all supported LLM models used during CrewAI agent execution.

Cost information is automatically captured and displayed in the Opik UI, including:

  • Token usage details
  • Cost per request based on model pricing
  • Total trace cost
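For intuition, per-request cost is derived by multiplying token counts by the model's per-token prices. The sketch below illustrates the arithmetic only; the pricing values are placeholders, not Opik's actual pricing data:

```python
# Illustrative cost calculation. PRICING_PER_1K values are placeholders,
# not Opik's pricing table — see the Supported Models page for real data.
PRICING_PER_1K = {
    "gpt-4o-mini": {"prompt": 0.00015, "completion": 0.0006},
}


def estimate_cost(model: str, prompt_tokens: int, completion_tokens: int) -> float:
    price = PRICING_PER_1K[model]
    return (
        (prompt_tokens / 1000) * price["prompt"]
        + (completion_tokens / 1000) * price["completion"]
    )


print(estimate_cost("gpt-4o-mini", 1200, 300))
```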

View the complete list of supported models and providers on the Supported Models page.

Grouping traces into conversational threads using thread_id

Threads in Opik are collections of traces that are grouped together using a unique thread_id.

The thread_id can be passed to the crew's kickoff call via the opik_args parameter, which groups all resulting traces into a single thread.

```python
from crewai import Agent, Crew, Task, Process
from opik.integrations.crewai import track_crewai

# Define your crew (using the example from above)
my_crew = YourCrewName().crew()

# Enable tracking with the crew instance (required for v1.0.0+)
track_crewai(project_name="crewai-integration-demo", crew=my_crew)

# Pass thread_id via opik_args
args_dict = {
    "trace": {
        "thread_id": "conversation-2",
    },
}

result = my_crew.kickoff(opik_args=args_dict)
```

More information on logging chat conversations can be found in the Log conversations section.