AG2

AG2 is an open-source programming framework for building AI agents and facilitating cooperation among multiple agents to solve tasks. AG2 aims to streamline the development and research of agentic AI. It offers features such as agents that can interact with each other, support for various large language models (LLMs) and tool use, autonomous and human-in-the-loop workflows, and multi-agent conversation patterns.

Opik’s integration with AG2 relies on OpenTelemetry. You can learn more about Opik’s OpenTelemetry features in our Get Started guide.

Getting started

To use the AG2 integration with Opik, you will need to have the following packages installed:

$pip install -U "ag2[openai]" opik opentelemetry-sdk opentelemetry-instrumentation-openai opentelemetry-instrumentation-threading opentelemetry-exporter-otlp

In addition, you will need to configure the OpenTelemetry integration through environment variables.

If you are using Opik Cloud, set the following:

$export OTEL_EXPORTER_OTLP_ENDPOINT=https://www.comet.com/opik/api/v1/private/otel
$export OTEL_EXPORTER_OTLP_HEADERS='Authorization=<your-api-key>,Comet-Workspace=default'
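
If you prefer to configure these values from Python instead of the shell, here is a minimal sketch, assuming it runs before the OTLPSpanExporter is created (the exporter reads these variables at construction time):

import os

# Equivalent to the shell exports above; must be set before the exporter is instantiated
os.environ["OTEL_EXPORTER_OTLP_ENDPOINT"] = "https://www.comet.com/opik/api/v1/private/otel"
os.environ["OTEL_EXPORTER_OTLP_HEADERS"] = "Authorization=<your-api-key>,Comet-Workspace=default"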

To log the traces to a specific project, you can add the projectName parameter to the OTEL_EXPORTER_OTLP_HEADERS environment variable:

$export OTEL_EXPORTER_OTLP_HEADERS='Authorization=<your-api-key>,Comet-Workspace=default,projectName=<your-project-name>'

You can also update the Comet-Workspace parameter to a different value if you would like to log the data to a different workspace.
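
For example, to send the data to a different workspace and project:

$export OTEL_EXPORTER_OTLP_HEADERS='Authorization=<your-api-key>,Comet-Workspace=<your-workspace-name>,projectName=<your-project-name>'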

Using Opik with AG2

The example below shows how to use the AG2 integration with Opik:

## First we will configure OpenTelemetry
from opentelemetry import trace
from opentelemetry.exporter.otlp.proto.http.trace_exporter import OTLPSpanExporter
from opentelemetry.sdk.resources import Resource
from opentelemetry.sdk.trace import TracerProvider
from opentelemetry.sdk.trace.export import BatchSpanProcessor
from opentelemetry.instrumentation.openai import OpenAIInstrumentor
from opentelemetry.instrumentation.threading import ThreadingInstrumentor


def setup_telemetry():
    """Configure OpenTelemetry with an HTTP exporter"""
    # Create a resource with service name and other metadata
    resource = Resource.create(
        {
            "service.name": "ag2-demo",
            "service.version": "1.0.0",
            "deployment.environment": "development",
        }
    )

    # Create a TracerProvider with the resource
    provider = TracerProvider(resource=resource)

    # Create a BatchSpanProcessor with an OTLPSpanExporter
    processor = BatchSpanProcessor(OTLPSpanExporter())
    provider.add_span_processor(processor)

    # Set the global TracerProvider
    trace.set_tracer_provider(provider)

    tracer = trace.get_tracer(__name__)

    # Instrument OpenAI calls
    OpenAIInstrumentor().instrument(tracer_provider=provider)

    # AG2 calls OpenAI in background threads; propagate the context
    # so all spans end up in the same trace
    ThreadingInstrumentor().instrument()

    return tracer, provider


# 1. Import our agent class
from autogen import ConversableAgent, LLMConfig

# 2. Define our LLM configuration for OpenAI's GPT-4o mini
#    (uses the OPENAI_API_KEY environment variable)
llm_config = LLMConfig(api_type="openai", model="gpt-4o-mini")

# 3. Create our LLM agent within the parent span context
with llm_config:
    my_agent = ConversableAgent(
        name="helpful_agent",
        system_message="You are a poetic AI assistant, respond in rhyme.",
    )


def main(message):
    response = my_agent.run(message=message, max_turns=2, user_input=True)

    # 5. Iterate through the chat automatically with console output
    response.process()

    # 6. Print the chat
    print(response.messages)

    return response.messages


if __name__ == "__main__":
    tracer, provider = setup_telemetry()

    # 4. Run the agent with a prompt
    with tracer.start_as_current_span(my_agent.name) as agent_span:
        message = "In one sentence, what's the big deal about AI?"

        agent_span.set_attribute("input", message)  # Manually log the question

        response = main(message)

        # Manually log the response (attribute values must be strings or primitives)
        agent_span.set_attribute("output", str(response))

    # Force flush all spans to ensure they are exported before the script exits
    provider.force_flush()
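
The final force_flush call matters: BatchSpanProcessor exports spans on a background thread, so a short-lived script can exit before the last spans are sent. As an alternative, here is a minimal sketch that registers the flush at interpreter exit, assuming the provider returned by setup_telemetry is still in scope:

import atexit

# shutdown() flushes any pending spans and then stops the batch processor's worker thread
atexit.register(provider.shutdown)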

Once the integration is set up, you will see the traces in Opik.