AI Router

Overview

LangGraph is a framework for building stateful, multi-actor AI applications with LLMs. It extends LangChain with graph-based agent orchestration, cycles, and controllability. By connecting LangGraph to Orq.ai’s AI Router, you get production-ready agentic workflows with access to 300+ models.

Key Benefits

Orq.ai’s AI Router enhances your LangGraph applications with:

Complete Observability

Track every agent step, tool use, and graph transition with detailed traces

Built-in Reliability

Automatic fallbacks, retries, and load balancing for production resilience

Cost Optimization

Real-time cost tracking and spend management across all your AI operations

Multi-Provider Access

Access 300+ LLMs and 20+ providers through a single, unified integration

Prerequisites

Before integrating LangGraph with Orq.ai, ensure you have:
  • An Orq.ai account and API Key
  • Python 3.8 or higher
To set up your API key, see API keys & Endpoints.
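The examples below read the API key from an ORQ_API_KEY environment variable (the variable name is a convention used in this guide, not a requirement). One way to set it before running any of the snippets:

```shell
# Export the Orq.ai API key for the current shell session
# (replace the placeholder with your actual key)
export ORQ_API_KEY="your-orq-api-key"
```

For persistence across sessions, add the line to your shell profile or use a .env file with a loader such as python-dotenv.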

Installation

pip install langgraph langchain-openai langchain-core

Configuration

Configure LangGraph to use Orq.ai’s AI Router by passing a ChatOpenAI instance with a custom base_url:
Python
from langchain_openai import ChatOpenAI
import os

llm = ChatOpenAI(
    model="gpt-4o",
    api_key=os.getenv("ORQ_API_KEY"),
    base_url="https://api.orq.ai/v2/router",
)

Basic Agent Example

Here’s a complete example using create_agent with a tool:
Python
from langchain.agents import create_agent
from langchain_openai import ChatOpenAI
from langchain_core.tools import tool
import os

llm = ChatOpenAI(
    model="gpt-4o",
    api_key=os.getenv("ORQ_API_KEY"),
    base_url="https://api.orq.ai/v2/router",
)

@tool
def get_weather(location: str) -> str:
    """Get the current weather for a location."""
    return f"The weather in {location} is sunny and 72°F"

agent = create_agent(llm, tools=[get_weather])

result = agent.invoke({"messages": [("user", "What's the weather in San Francisco?")]})
print(result["messages"][-1].content)

Agent with Multiple Tools

Python
from langchain.agents import create_agent
from langchain_openai import ChatOpenAI
from langchain_core.tools import tool
import os

llm = ChatOpenAI(
    model="gpt-4o",
    api_key=os.getenv("ORQ_API_KEY"),
    base_url="https://api.orq.ai/v2/router",
)

@tool
def get_weather(location: str) -> str:
    """Get the current weather for a location."""
    return f"The weather in {location} is sunny and 72°F"

@tool
def add(a: int, b: int) -> int:
    """Add two integers."""
    return a + b

@tool
def multiply(a: int, b: int) -> int:
    """Multiply two integers."""
    return a * b

agent = create_agent(
    llm,
    tools=[get_weather, add, multiply],
    system_prompt="You are a helpful assistant with access to weather and math tools.",
)

result = agent.invoke({
    "messages": [("user", "What is 15 * 4? Also check the weather in Tokyo.")]
})
print(result["messages"][-1].content)

Streaming

Stream agent steps as they happen:
Python
from langchain.agents import create_agent
from langchain_openai import ChatOpenAI
from langchain_core.tools import tool
import os

llm = ChatOpenAI(
    model="gpt-4o",
    api_key=os.getenv("ORQ_API_KEY"),
    base_url="https://api.orq.ai/v2/router",
)

@tool
def get_weather(location: str) -> str:
    """Get the current weather for a location."""
    return f"The weather in {location} is sunny and 72°F"

agent = create_agent(llm, tools=[get_weather])

for chunk in agent.stream(
    {"messages": [("user", "What's the weather in Paris?")]},
    stream_mode="updates",
):
    print(chunk)

Model Selection

With Orq.ai, you can use any supported model from 20+ providers:
Python
from langchain.agents import create_agent
from langchain_openai import ChatOpenAI
from langchain_core.tools import tool
import os

@tool
def get_weather(location: str) -> str:
    """Get the current weather for a location."""
    return f"The weather in {location} is sunny and 72°F"

# Use Claude
claude_agent = create_agent(
    ChatOpenAI(
        model="claude-sonnet-4-5-20250929",
        api_key=os.getenv("ORQ_API_KEY"),
        base_url="https://api.orq.ai/v2/router",
    ),
    tools=[get_weather],
)

# Use Gemini
gemini_agent = create_agent(
    ChatOpenAI(
        model="gemini-2.5-flash",
        api_key=os.getenv("ORQ_API_KEY"),
        base_url="https://api.orq.ai/v2/router",
    ),
    tools=[get_weather],
)

result = claude_agent.invoke({"messages": [("user", "What's the weather in London?")]})
print(result["messages"][-1].content)