AI Router
Route your LLM calls through the AI Router with a single endpoint change. Zero vendor lock-in: always run on the best model at the lowest cost for your use case.
Observability
Instrument your code with OpenTelemetry to capture traces, logs, and metrics for every LLM call, agent step, and tool use.
AI Router
Overview
Use the Azure AI Inference SDK to route all model calls through Orq.ai’s AI Router. Point ChatCompletionsClient at Orq’s endpoint to access 250+ models from 20+ providers — OpenAI, Anthropic, Google, and more — without changing your agent logic.
Key Benefits
Complete Observability
Track every agent step, tool use, and LLM call with detailed traces and analytics
Built-in Reliability
Automatic fallbacks, retries, and load balancing for production resilience
Cost Optimization
Real-time cost tracking and spend management across all your AI operations
Multi-Provider Access
Access 250+ LLMs and 20+ providers through a single, unified integration
Prerequisites
- An Orq.ai account and API Key
- Python 3.9 or higher
To set up your API key, see API keys & Endpoints.
Installation
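Install the Azure AI Inference SDK from PyPI:

```shell
pip install azure-ai-inference
```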
Configuration
Configure ChatCompletionsClient to point at Orq.ai’s AI Router:
endpoint: https://api.orq.ai/v2/router
Basic Example
Agent with Function Tools
ChatCompletionsClient supports multi-turn tool calling. The agent loop runs until no more tool calls are returned:
Model Selection
Switch models by changing the model parameter. All 250+ models are available through the same client:
Observability
Overview
Instrument your Azure AI Agents application with OpenTelemetry to send traces to Orq.ai. The azure-core-tracing-opentelemetry package hooks into the Azure SDK’s distributed tracing mechanism, automatically capturing spans for every agent call, thread operation, and LLM invocation.
Prerequisites
- An Orq.ai account and API Key
- Azure AI Foundry project with an agent deployed
- Python 3.9+
Set the following environment variables:
- AZURE_AI_PROJECT_ENDPOINT — your Azure AI Foundry project endpoint
- AZURE_AI_MODEL_DEPLOYMENT_NAME — the model deployment name in your Foundry project
Install Dependencies
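A typical dependency set for this integration (package names as published on PyPI; pin versions as your project requires):

```shell
pip install azure-ai-projects azure-identity azure-core-tracing-opentelemetry \
    opentelemetry-sdk opentelemetry-exporter-otlp-proto-http
```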
Configuration
Basic Example
View Traces
View your traces in the Traces tab of your AI Studio, where real-time analytics are also available.