Observability
Instrument your code with OpenTelemetry to capture traces, logs, and metrics for every LLM call, agent step, and tool use.
Getting Started
LiteLLM provides a unified interface to OpenAI, Anthropic, Cohere, and 100+ other LLM providers, enabling seamless switching between them. Tracing LiteLLM with Orq.ai gives you comprehensive insight into provider performance, cost optimization, routing decisions, and API reliability across your multi-provider setup.
Prerequisites
Before you begin, ensure you have:
- An Orq.ai account and API key
- LiteLLM installed in your project
- Python 3.8+
- API keys for your LLM providers (OpenAI, Anthropic, Cohere, etc.)
Install Dependencies
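A typical installation, assuming traces are exported over OTLP (the OpenTelemetry packages named here are common choices, not the only option):

```shell
# LiteLLM plus the OpenTelemetry SDK and OTLP exporter
pip install litellm opentelemetry-sdk opentelemetry-exporter-otlp
```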
Configure Orq.ai
Set up your environment variables to connect to Orq.ai's OpenTelemetry collector (Unix/Linux/macOS):
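For example (the endpoint URL below is an assumption; copy the exact values from your Orq.ai workspace settings):

```shell
# Standard OpenTelemetry exporter variables; the endpoint and auth
# header values are assumptions -- verify them in your Orq.ai workspace
export OTEL_EXPORTER_OTLP_ENDPOINT="https://api.orq.ai/v2/otel"
export OTEL_EXPORTER_OTLP_HEADERS="Authorization=Bearer $ORQ_API_KEY"
```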
Integrations
Choose your preferred OpenTelemetry framework for collecting traces:
LiteLLM
Auto-instrumentation with minimal setup:
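A minimal sketch of what that setup can look like. It assumes LiteLLM's built-in "otel" callback (check the LiteLLM docs for the callback name in your version) and the `OTEL_EXPORTER_OTLP_*` variables from the previous step:

```python
def enable_tracing() -> None:
    """Route every LiteLLM call through OpenTelemetry.

    Assumes `litellm` is installed and the OTEL_EXPORTER_OTLP_*
    environment variables point at Orq.ai's collector.
    """
    import litellm  # deferred so the sketch can be read without litellm installed

    # "otel" is LiteLLM's built-in OpenTelemetry callback; once set,
    # completion calls emit spans automatically.
    litellm.callbacks = ["otel"]

# Usage: call once at startup, before any completion calls.
# enable_tracing()
```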
Examples
Basic Multi-Provider Usage
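A sketch of a basic multi-provider call, assuming provider API keys are set in the environment (the model names are illustrative; any model LiteLLM supports can be substituted):

```python
MODELS = [
    "gpt-4o-mini",                 # OpenAI
    "claude-3-5-sonnet-20240620",  # Anthropic
    "command-r",                   # Cohere
]

def ask(model: str, prompt: str) -> str:
    """Send the same prompt to any provider through LiteLLM's unified API."""
    from litellm import completion  # deferred import; `pip install litellm`

    response = completion(
        model=model,
        messages=[{"role": "user", "content": prompt}],
    )
    return response.choices[0].message.content

# Usage (requires provider API keys, e.g. OPENAI_API_KEY, in the env):
# for model in MODELS:
#     print(model, "->", ask(model, "Summarize OpenTelemetry in one sentence."))
```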
View Traces
Head to the Traces tab to view LiteLLM traces in the Orq.ai Studio.