AI Router
Route your LLM calls through the AI Router with a single base URL change. Zero vendor lock-in: always run on the best model at the lowest cost for your use case.
Observability
Instrument your code with OpenTelemetry to capture traces, logs, and metrics for every LLM call, agent step, and tool use.
AI Router
Overview
Microsoft AutoGen is a framework for building multi-agent conversational AI systems that solve problems collaboratively through automated agent interactions. By connecting AutoGen to Orq.ai’s AI Router, you get access to 300+ models for your multi-agent workflows with a single configuration change.
Key Benefits
Orq.ai’s AI Router enhances your AutoGen applications with:
Complete Observability
Track every agent conversation, tool use, and multi-agent interaction
Built-in Reliability
Automatic fallbacks, retries, and load balancing for production resilience
Cost Optimization
Real-time cost tracking and spend management across all your AI operations
Multi-Provider Access
Access 300+ LLMs and 20+ providers through a single, unified integration
Prerequisites
Before integrating AutoGen with Orq.ai, ensure you have:
- An Orq.ai account and API Key
- Python 3.10 or higher
To set up your API key, see API keys & Endpoints.
Installation
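A minimal install, assuming the current autogen-agentchat packaging (the exact package names may differ for your AutoGen version — verify against the AutoGen docs):

```shell
pip install -U "autogen-agentchat" "autogen-ext[openai]"
```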
Configuration
Configure AutoGen to use Orq.ai’s AI Router via OpenAIChatCompletionClient with a custom base_url:
base_url: https://api.orq.ai/v2/router
The model_info dict is required when using a custom base_url so AutoGen knows the model’s capabilities.
Basic Agent Example
Multi-Agent Team
Orchestrate multiple specialized agents with RoundRobinGroupChat:
Model Selection
With Orq.ai, you can use any supported model from 20+ providers:
Observability
Getting Started
Microsoft AutoGen enables sophisticated multi-agent conversations and collaborative AI systems. Tracing AutoGen with Orq.ai provides deep insight into agent interactions, conversation flows, tool usage, and multi-agent coordination patterns, helping you optimize your conversational AI applications.
Prerequisites
Before you begin, ensure you have:
- An Orq.ai account and API Key
- Python 3.8+
- Microsoft AutoGen installed in your project
- An OpenAI API key (or other LLM provider credentials)
Install Dependencies
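A minimal dependency set, assuming the autogen-agentchat packaging plus the OpenTelemetry SDK and OTLP/HTTP exporter (package names may differ for your versions):

```shell
pip install -U "autogen-agentchat" "autogen-ext[openai]" \
  opentelemetry-sdk opentelemetry-exporter-otlp-proto-http
```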
Configure Orq.ai
Set up your environment variables to connect to Orq.ai’s OpenTelemetry collector (Unix/Linux/macOS).
Integration
AutoGen has built-in OpenTelemetry support. Configure the tracer provider and pass it to the AutoGen runtime. Set up OpenTelemetry tracing in your application:
All AutoGen agent conversations and interactions will be instrumented and exported to Orq.ai through the OTLP exporter. For more details, see Traces.
Advanced Examples
Multi-Agent Group Chat
AutoGen is also usable through our AI Router; to learn more, see AutoGen Gateway.