Observability for BeeAI (TypeScript) with Opik

BeeAI is an agent framework focused on simplicity and performance. It provides a clean API for building agents, with built-in support for tool use, conversation management, and an extensible architecture.

BeeAI’s primary advantage is its lightweight design, which makes it easy to create and deploy AI agents without unnecessary complexity while retaining the capabilities needed for production use.

BeeAI TypeScript tracing

Getting started

To use the BeeAI integration with Opik, you will need to have BeeAI and the required OpenTelemetry packages installed.

Installation

Option 1: Using npm

npm install beeai-framework@0.1.13 @ai-sdk/openai @arizeai/openinference-instrumentation-beeai @opentelemetry/sdk-node dotenv

Option 2: Using yarn

yarn add beeai-framework@0.1.13 @ai-sdk/openai @arizeai/openinference-instrumentation-beeai @opentelemetry/sdk-node dotenv

Version Compatibility: The BeeAI instrumentation currently supports beeai-framework version 0.1.13. Using a newer version may cause compatibility issues.

Requirements

  • Node.js ≥ 18
  • BeeAI Framework (beeai-framework)
  • OpenInference Instrumentation for BeeAI (@arizeai/openinference-instrumentation-beeai)
  • OpenTelemetry SDK for Node.js (@opentelemetry/sdk-node)

Environment configuration

Configure your environment variables based on your Opik deployment:

# Your LLM API key
export OPENAI_API_KEY="your-openai-api-key"

# Opik configuration
export OTEL_EXPORTER_OTLP_ENDPOINT=https://www.comet.com/opik/api/v1/private/otel
export OTEL_EXPORTER_OTLP_HEADERS='Authorization=<your-api-key>,Comet-Workspace=default'
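If you are running a self-hosted Opik instance rather than Comet-hosted Opik, point the exporter at your local deployment instead. A sketch, assuming the default local port (5173) and no authentication on the local instance:

```shell
# Self-hosted Opik (assumes the default local port; adjust host/port to your deployment)
export OTEL_EXPORTER_OTLP_ENDPOINT=http://localhost:5173/api/v1/private/otel
# No Authorization header is needed for an unauthenticated local instance
export OTEL_EXPORTER_OTLP_HEADERS='projectName=<your-project-name>'
```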

To log the traces to a specific project, you can add the projectName parameter to the OTEL_EXPORTER_OTLP_HEADERS environment variable:

export OTEL_EXPORTER_OTLP_HEADERS='Authorization=<your-api-key>,Comet-Workspace=default,projectName=<your-project-name>'

You can also update the Comet-Workspace parameter to a different value if you would like to log the data to a different workspace.
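The OTEL_EXPORTER_OTLP_HEADERS value is a comma-separated list of key=value pairs. If your traces are not showing up in the expected workspace or project, a quick sanity check is to parse the string the same way an exporter would; a minimal sketch (parseOtlpHeaders is a hypothetical helper for illustration, not part of any package):

```typescript
// Parse a comma-separated "key=value" header string, as used by
// OTEL_EXPORTER_OTLP_HEADERS, into a plain record.
function parseOtlpHeaders(raw: string): Record<string, string> {
  const headers: Record<string, string> = {};
  for (const pair of raw.split(",")) {
    const idx = pair.indexOf("=");
    if (idx > 0) {
      headers[pair.slice(0, idx).trim()] = pair.slice(idx + 1).trim();
    }
  }
  return headers;
}

// Example: inspect the headers your process will actually send
const parsed = parseOtlpHeaders(
  "Authorization=<your-api-key>,Comet-Workspace=default,projectName=demo"
);
console.log(parsed);
```

Note that values containing commas or equals signs would need escaping; for the simple keys used here, a plain split is sufficient.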

Using Opik with BeeAI

Set up OpenTelemetry instrumentation for BeeAI:

import "dotenv/config";
import { NodeSDK } from "@opentelemetry/sdk-node";
import { OTLPTraceExporter } from "@opentelemetry/exporter-trace-otlp-http";
import { BeeAIInstrumentation } from "@arizeai/openinference-instrumentation-beeai";
import * as beeaiFramework from "beeai-framework";

// Initialize BeeAI instrumentation
const beeAIInstrumentation = new BeeAIInstrumentation();

// Configure and start the OpenTelemetry SDK
const sdk = new NodeSDK({
  traceExporter: new OTLPTraceExporter(),
  instrumentations: [beeAIInstrumentation],
});
sdk.start();

// Manually patch the BeeAI framework (required for trace collection)
beeAIInstrumentation.manuallyInstrument(beeaiFramework);

// Now you can use BeeAI as normal
import { ReActAgent } from "beeai-framework/agents/react";
import { OpenAIChatModel } from "beeai-framework/adapters/openai/backend/chat";
import { WikipediaTool } from "beeai-framework/tools/search/wikipedia";
import { OpenMeteoTool } from "beeai-framework/tools/weather/openmeteo";
import { TokenMemory } from "beeai-framework/memory";

// Initialize the OpenAI language model
const llm = new OpenAIChatModel("gpt-4o-mini", {
  temperature: 0.7,
});

// Create tools for the agent
const tools = [new WikipediaTool(), new OpenMeteoTool()];

// Create a ReAct agent with memory
const agent = new ReActAgent({
  llm,
  tools,
  memory: new TokenMemory({ llm }),
});

// Run the agent
async function main() {
  const response = await agent.run({
    prompt:
      "I'm planning a trip to Barcelona, Spain. Can you research key attractions and landmarks I should visit, and also tell me what the current weather conditions are like there?",
    execution: {
      maxRetriesPerStep: 3,
      totalMaxRetries: 10,
      maxIterations: 5,
    },
  });

  console.log("Agent Response:", response.result.text);
  return response;
}

// Run the example, then shut down the SDK so pending spans are
// flushed before the process exits
main()
  .catch(console.error)
  .finally(() => sdk.shutdown());

Further improvements

If you have any questions or suggestions for improving the BeeAI integration, please open an issue on our GitHub repository.