BeeAI Integration via OpenTelemetry

BeeAI is an agent framework focused on simplicity and performance. It provides a clean API for building agents with built-in support for tool usage, conversation management, and an extensible architecture.

BeeAI’s primary advantage is its lightweight design that makes it easy to create and deploy AI agents without unnecessary complexity, while maintaining powerful capabilities for production use.

BeeAI tracing

Getting started

To use the BeeAI integration with Opik, you will need to have BeeAI and the required OpenTelemetry packages installed:

$ pip install beeai-framework openinference-instrumentation-beeai "beeai-framework[wikipedia]" opentelemetry-api opentelemetry-sdk opentelemetry-exporter-otlp

Environment configuration

Configure your environment variables based on your Opik deployment:

If you are using Opik Cloud, you will need to set the following environment variables:

$ export OTEL_EXPORTER_OTLP_ENDPOINT=https://www.comet.com/opik/api/v1/private/otel
$ export OTEL_EXPORTER_OTLP_HEADERS='Authorization=<your-api-key>,Comet-Workspace=default'

To log the traces to a specific project, you can add the projectName parameter to the OTEL_EXPORTER_OTLP_HEADERS environment variable:

$ export OTEL_EXPORTER_OTLP_HEADERS='Authorization=<your-api-key>,Comet-Workspace=default,projectName=<your-project-name>'

You can also set the Comet-Workspace parameter to a different value to log the data to a different workspace.
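If you prefer to configure these values from Python rather than the shell, you can set the same variables with os.environ before the OTLP exporter is created. This is a minimal sketch; the header string uses the same placeholder values as the commands above, which you should replace with your own:

```python
import os

# Same endpoint and headers as the shell commands above; the API key,
# workspace, and project name placeholders must be replaced with your own.
OPIK_ENDPOINT = "https://www.comet.com/opik/api/v1/private/otel"
OPIK_HEADERS = (
    "Authorization=<your-api-key>,"
    "Comet-Workspace=default,"
    "projectName=<your-project-name>"
)

# The OTLP exporter reads these variables at construction time, so set them
# before OTLPSpanExporter() is instantiated.
os.environ["OTEL_EXPORTER_OTLP_ENDPOINT"] = OPIK_ENDPOINT
os.environ["OTEL_EXPORTER_OTLP_HEADERS"] = OPIK_HEADERS
```

Environment variables set this way apply only to the current process, which makes it easy to keep per-script project names.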

Using Opik with BeeAI

Set up OpenTelemetry instrumentation for BeeAI:

from opentelemetry import trace
from opentelemetry.sdk.trace import TracerProvider
from opentelemetry.sdk.trace.export import BatchSpanProcessor
from opentelemetry.exporter.otlp.proto.http.trace_exporter import OTLPSpanExporter
from openinference.instrumentation.beeai import BeeAIInstrumentor

# Configure the OTLP exporter for Opik (reads the OTEL_EXPORTER_OTLP_* variables)
otlp_exporter = OTLPSpanExporter()

# Set up the tracer provider
trace.set_tracer_provider(TracerProvider())
trace.get_tracer_provider().add_span_processor(
    BatchSpanProcessor(otlp_exporter)  # OTLP for sending to Opik
)

# Instrument BeeAI before creating any agents
BeeAIInstrumentor().instrument()

import asyncio

from beeai_framework.agents.react import ReActAgent
from beeai_framework.agents.types import AgentExecutionConfig
from beeai_framework.backend.chat import ChatModel
from beeai_framework.backend.types import ChatModelParameters
from beeai_framework.memory import TokenMemory
from beeai_framework.tools.search.wikipedia import WikipediaTool
from beeai_framework.tools.weather.openmeteo import OpenMeteoTool

# Initialize the language model
llm = ChatModel.from_name(
    "openai:gpt-4o-mini",  # or "ollama:granite3.3:8b" for local Ollama
    ChatModelParameters(temperature=0.7),
)

# Create tools for the agent
tools = [
    WikipediaTool(),
    OpenMeteoTool(),
]

# Create a ReAct agent with memory
agent = ReActAgent(llm=llm, tools=tools, memory=TokenMemory(llm))


# Run the agent
async def main():
    response = await agent.run(
        prompt="I'm planning a trip to Barcelona, Spain. Can you research key attractions and landmarks I should visit, and also tell me what the current weather conditions are like there?",
        execution=AgentExecutionConfig(
            max_retries_per_step=3, total_max_retries=10, max_iterations=5
        ),
    )
    print("Agent Response:", response.result.text)
    return response


# Run the example
if __name__ == "__main__":
    asyncio.run(main())

Further improvements

If you have any questions or suggestions for improving the BeeAI integration, please open an issue on our GitHub repository.