Observability

Instrument your code with OpenTelemetry to capture traces, logs, and metrics for every LLM call, agent step, and tool use.

Overview

BeeAI is IBM’s open-source agent framework for building production-ready multi-agent systems. This integration uses the openinference-instrumentation-beeai library to export traces to Orq.ai via OpenTelemetry.

Prerequisites

  • An Orq.ai account and API key (ORQ_API_KEY)
  • Python 3.10+
  • An OpenAI API key (OPENAI_API_KEY) — the examples use OpenAI models
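The examples below read both keys from the environment. A minimal setup, with placeholder values you should replace with your own keys:

```shell
# Export the keys the examples read via os.environ
# (placeholder values; substitute your real keys).
export ORQ_API_KEY="your-orq-api-key"
export OPENAI_API_KEY="your-openai-api-key"
```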

Installation

pip install beeai-framework \
    opentelemetry-api \
    opentelemetry-sdk \
    "opentelemetry-exporter-otlp-proto-http" \
    openinference-instrumentation-beeai

Configuring Orq.ai Observability

import os
from opentelemetry.sdk.trace import TracerProvider
from opentelemetry.sdk.trace.export import BatchSpanProcessor
from opentelemetry.exporter.otlp.proto.http.trace_exporter import OTLPSpanExporter
from openinference.instrumentation.beeai import BeeAIInstrumentor

# Send spans to Orq.ai's OTLP/HTTP traces endpoint,
# authenticating with your Orq.ai API key.
exporter = OTLPSpanExporter(
    endpoint="https://api.orq.ai/v2/otel/v1/traces",
    headers={"Authorization": f"Bearer {os.environ['ORQ_API_KEY']}"},
)
tracer_provider = TracerProvider()
tracer_provider.add_span_processor(BatchSpanProcessor(exporter))

# Register the BeeAI instrumentor so every agent run, LLM call,
# and tool use emits spans through this provider.
BeeAIInstrumentor().instrument(tracer_provider=tracer_provider)

Basic Example

import os
import asyncio
from opentelemetry.sdk.trace import TracerProvider
from opentelemetry.sdk.trace.export import BatchSpanProcessor
from opentelemetry.exporter.otlp.proto.http.trace_exporter import OTLPSpanExporter
from openinference.instrumentation.beeai import BeeAIInstrumentor

exporter = OTLPSpanExporter(
    endpoint="https://api.orq.ai/v2/otel/v1/traces",
    headers={"Authorization": f"Bearer {os.environ['ORQ_API_KEY']}"},
)
tracer_provider = TracerProvider()
tracer_provider.add_span_processor(BatchSpanProcessor(exporter))
BeeAIInstrumentor().instrument(tracer_provider=tracer_provider)

from beeai_framework.agents.react import ReActAgent
from beeai_framework.adapters.openai import OpenAIChatModel
from beeai_framework.memory import UnconstrainedMemory

async def main():
    # A minimal agent with no tools; the instrumentor records the
    # agent steps and the underlying LLM call automatically.
    agent = ReActAgent(
        llm=OpenAIChatModel("gpt-4o-mini"),
        memory=UnconstrainedMemory(),
        tools=[],
    )
    result = await agent.run("What is 2 + 2?")
    print(result.output.text)

if __name__ == "__main__":
    asyncio.run(main())