Connect LangChain to the AI Router for enhanced LLM orchestration. Use Orq.ai as a drop-in provider for chains, agents, and RAG applications.
AI Router
Route your LLM calls through the AI Router with a single base URL change. Zero vendor lock-in: always run on the best model at the lowest cost for your use case.
Observability
Instrument your code with OpenTelemetry to capture traces, logs, and metrics for every LLM call, agent step, and tool use.
LangChain is a framework for building LLM-powered applications through composable chains, agents, and integrations with external data sources. By connecting LangChain to Orq.ai’s AI Router, you access 300+ models through a single base URL change.
```python
import os

from langchain_openai import ChatOpenAI

llm = ChatOpenAI(
    model="gpt-4o",
    api_key=os.getenv("ORQ_API_KEY"),
    base_url="https://api.orq.ai/v2/router",
)

for chunk in llm.stream("Write a short poem about the ocean."):
    print(chunk.content, end="", flush=True)
print()
```
You will need:

- An Orq.ai API key (from your Orq.ai workspace). If you don’t have an account, sign up here.
- An API key for the LLM provider you’ll be using (e.g., OpenAI).

Set the following environment variables:
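For example, in a POSIX shell (the `OPENAI_API_KEY` name is LangChain's default for `ChatOpenAI`; replace the placeholder values with your own keys):

```shell
# Key used to authenticate with Orq.ai (read via ORQ_API_KEY in the snippets below)
export ORQ_API_KEY="your-orq-api-key"
# Key for the underlying LLM provider, e.g. OpenAI
export OPENAI_API_KEY="your-openai-api-key"
```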
Here’s an example of running a simple LangGraph ReAct agent with a custom tool.
```python
from langgraph.prebuilt import create_react_agent

def get_weather(city: str) -> str:
    """Get weather for a given city."""
    return f"It's always sunny in {city}!"

agent = create_react_agent(
    model="openai:gpt-5-mini",
    tools=[get_weather],
    prompt="You are a helpful assistant.",
)

# Run the agent; this will generate traces in Orq.ai
agent.invoke(
    {"messages": [{"role": "user", "content": "What is the weather in San Francisco?"}]}
)
```
The following snippet shows how to create and run a simple LangChain app that also sends traces to Orq.ai.
```python
from langchain_openai import ChatOpenAI
from langchain_core.prompts import ChatPromptTemplate

# Define a prompt and chain
prompt = ChatPromptTemplate.from_template("Tell me a {action} about {topic}")
model = ChatOpenAI(temperature=0.7)
chain = prompt | model

# Invoke the chain
result = chain.invoke({"topic": "programming", "action": "joke"})
print(result.content)
```
At this point, you should see traces from both LangGraph and LangChain examples appear in your Orq.ai workspace.
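If traces don't appear, check that an OpenTelemetry trace exporter is configured to ship spans to Orq.ai. The sketch below uses the standard OpenTelemetry Python SDK; the ingestion endpoint URL shown is an assumption, so verify the exact OTLP endpoint in your Orq.ai workspace documentation before using it.

```python
import os

from opentelemetry import trace
from opentelemetry.exporter.otlp.proto.http.trace_exporter import OTLPSpanExporter
from opentelemetry.sdk.trace import TracerProvider
from opentelemetry.sdk.trace.export import BatchSpanProcessor

# Export spans over OTLP/HTTP, authenticated with your Orq.ai API key.
# NOTE: the endpoint below is a placeholder assumption; confirm the real
# ingestion URL in your Orq.ai workspace settings.
exporter = OTLPSpanExporter(
    endpoint="https://api.orq.ai/v2/otel/v1/traces",  # assumed endpoint
    headers={"Authorization": f"Bearer {os.getenv('ORQ_API_KEY', '')}"},
)

provider = TracerProvider()
provider.add_span_processor(BatchSpanProcessor(exporter))
trace.set_tracer_provider(provider)
```

With a provider registered, any instrumentation that emits OpenTelemetry spans for your LangChain or LangGraph runs will flow through this exporter.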