Observability for Agent Spec with Opik

Open Agent Specification is a portable configuration language for defining agentic systems (agents, tools, and structured workflows).

Agent Spec Tracing is an extension of Agent Spec that standardizes how agent and flow executions emit traces. This makes it easier to analyze what happened (LLM calls, tool calls, and intermediate steps) across different runtimes and adapters.

Account Setup

Comet provides a hosted version of the Opik platform: simply create an account and grab your API key.

You can also run the Opik platform locally; see the installation guide for more information.

Getting Started

Installation

To use Agent Spec with Opik, install opik, pyagentspec, and the OpenTelemetry packages:

$ pip install -U opik pyagentspec opentelemetry-sdk opentelemetry-instrumentation

If you are using the LangGraph adapter (as in the example below), install the LangGraph extra as well:

$ pip install -U "pyagentspec[langgraph]"

If you are using another framework, install the corresponding pyagentspec extra instead; see the pyagentspec installation instructions for the available extras.

Configuring Opik

Configure the Opik Python SDK for your deployment type. See the Python SDK Configuration guide for detailed instructions on:

  • CLI configuration: opik configure
  • Code configuration: opik.configure()
  • Self-hosted vs Cloud vs Enterprise setup
  • Configuration files and environment variables
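As a minimal sketch of the code-configuration option, a hosted setup might look like the following (the `api_key` and `workspace` values are placeholders for your own credentials):

```python
import opik

# Configure the Opik SDK in code rather than via the `opik configure` CLI.
# The values below are placeholders -- substitute your own credentials.
opik.configure(
    api_key="YOUR_OPIK_API_KEY",  # from your Comet account settings
    workspace="YOUR_WORKSPACE",   # your Comet workspace name
)

# For a self-hosted deployment, point the SDK at your local instance instead:
# opik.configure(use_local=True)
```

Environment variables and configuration files (covered in the Python SDK Configuration guide) achieve the same result without hard-coding values in your script.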

Configuring your LLM provider

To run the example below, you need to configure your LLM provider API key. This example uses OpenAI, so you will need an OpenAI API key from your OpenAI account.

You can set it as an environment variable:

$ export OPENAI_API_KEY="YOUR_API_KEY"

Or set it programmatically:

import os
import getpass

if "OPENAI_API_KEY" not in os.environ:
    os.environ["OPENAI_API_KEY"] = getpass.getpass("Enter your OpenAI API key: ")

Tracing Agent Spec workflows with Opik

Opik provides an AgentSpecInstrumentor that connects Agent Spec Tracing to Opik. Wrap your Agent Spec runtime execution in the instrumentor context to capture traces.

import asyncio

from pyagentspec.agent import Agent
from pyagentspec.llms import OpenAiConfig
from pyagentspec.property import FloatProperty
from pyagentspec.tools import ServerTool


def build_agentspec_agent() -> Agent:
    tools = [
        ServerTool(
            name="sum",
            description="Sum two numbers",
            inputs=[FloatProperty(title="a"), FloatProperty(title="b")],
            outputs=[FloatProperty(title="result")],
        ),
        ServerTool(
            name="subtract",
            description="Subtract two numbers",
            inputs=[FloatProperty(title="a"), FloatProperty(title="b")],
            outputs=[FloatProperty(title="result")],
        ),
    ]

    return Agent(
        name="calculator_agent",
        description="An agent that provides assistance with tool use.",
        llm_config=OpenAiConfig(name="openai-gpt-5-mini", model_id="gpt-5-mini"),
        system_prompt=(
            "You are a helpful calculator agent.\n"
            "Your duty is to compute the result of the given operation using tools, "
            "and to output the result.\n"
            "It's important that you reply with the result only.\n"
        ),
        tools=tools,
    )


async def main():
    from opik.integrations.agentspec import AgentSpecInstrumentor
    from pyagentspec.adapters.langgraph import AgentSpecLoader

    agent = build_agentspec_agent()
    tool_registry = {
        "sum": lambda a, b: a + b,
        "subtract": lambda a, b: a - b,
    }

    langgraph_agent = AgentSpecLoader(tool_registry=tool_registry).load_component(agent)

    with AgentSpecInstrumentor().instrument_context(
        project_name="agentspec-demo",
        mask_sensitive_information=False,
    ):
        messages = []
        while True:
            user_input = input("USER >>> ")
            if user_input.lower() in ["exit", "quit"]:
                break
            messages.append({"role": "user", "content": user_input})
            response = langgraph_agent.invoke(
                input={"messages": messages},
                config={"configurable": {"thread_id": "1"}},
            )
            agent_answer = response["messages"][-1].content.strip()
            print("AGENT >>>", agent_answer)
            messages.append({"role": "assistant", "content": agent_answer})


if __name__ == "__main__":
    asyncio.run(main())

Agent Spec traces often include prompts, tool inputs/outputs, and messages. If you need to avoid logging sensitive information, set mask_sensitive_information=True.
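For instance, the instrumentor context from the example above would become:

```python
from opik.integrations.agentspec import AgentSpecInstrumentor

# Same context manager as in the example above, but with masking enabled
# so prompts, tool inputs/outputs, and message contents are redacted
# from the traces sent to Opik.
with AgentSpecInstrumentor().instrument_context(
    project_name="agentspec-demo",
    mask_sensitive_information=True,
):
    ...  # run your agent as usual
```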

Once you run the script and interact with your agent, you can inspect the trace tree in Opik to debug tool usage, LLM generations, and intermediate steps.

Further improvements

If you would like to see us improve this integration, simply open a new feature request on GitHub.