AI Router
Overview
LangGraph is a framework for building stateful, multi-actor AI applications with LLMs. It extends LangChain with graph-based agent orchestration, cycles, and controllability. By connecting LangGraph to Orq.ai’s AI Router, you get production-ready agentic workflows with access to 300+ models.
Key Benefits
Orq.ai’s AI Router enhances your LangGraph applications with:
Complete Observability
Track every agent step, tool use, and graph transition with detailed traces
Built-in Reliability
Automatic fallbacks, retries, and load balancing for production resilience
Cost Optimization
Real-time cost tracking and spend management across all your AI operations
Multi-Provider Access
Access 300+ LLMs and 20+ providers through a single, unified integration
Prerequisites
Before integrating LangGraph with Orq.ai, ensure you have:
- An Orq.ai account and API key
- Python 3.8 or higher
To set up your API key, see API keys & Endpoints.
Installation
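The package names below are an assumption based on the examples in this guide (LangGraph with LangChain's OpenAI-compatible client); adjust them to your project's needs:

```shell
pip install langchain langgraph langchain-openai
```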
Configuration
Configure LangGraph to use Orq.ai’s AI Router by passing a ChatOpenAI instance with a custom base_url:
base_url: https://api.orq.ai/v2/router
Basic Agent Example
Here’s a complete example using create_agent with a tool:
Agent with Multiple Tools
Streaming
Stream agent steps as they happen:
Model Selection
With Orq.ai, you can use any supported model from 20+ providers:
Observability
Asset Capture in the Control Tower
When you instrument LangGraph with OpenTelemetry and send traces to Orq.ai, agents, tools, and models are automatically extracted from the spans and registered in Control Tower.
Installation
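The packages below are an assumption for a typical OpenTelemetry-over-OTLP setup with OpenInference's LangChain instrumentation; check your Orq.ai workspace for the exact packages it recommends:

```shell
pip install opentelemetry-sdk opentelemetry-exporter-otlp-proto-http \
    openinference-instrumentation-langchain
```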
Configuration
Examples
Agent with a single tool — captures agent/weather_agent, tool/get_weather, model/gpt-4o-mini
Agent with multiple tools — captures agent/assistant_agent, tool/calculator, tool/get_time, model/gpt-4o-mini
Agent without tools — captures agent/smart_agent, model/gpt-4o