LangChain / LangGraph
Integrate Orq.ai with LangChain and LangGraph using OpenTelemetry
Getting Started
Step 1: Install dependencies
Run the following command to install the required libraries:
pip install langchain langchain-openai langgraph "langsmith[otel]"
Step 2: Configure Environment Variables
You will need:
• An Orq.ai API key (from your Orq.ai workspace). If you don’t have an account, sign up here.
• An API key for the LLM provider you’ll be using (e.g., OpenAI).
Set the following environment variables:
import os
# Orq.ai OpenTelemetry exporter
os.environ["OTEL_EXPORTER_OTLP_ENDPOINT"] = "https://api.orq.ai/v2/otel"
# Replace $ORQ_API_KEY with your actual Orq.ai API key; Python does not expand shell variables inside strings
os.environ["OTEL_EXPORTER_OTLP_HEADERS"] = "Authorization=Bearer $ORQ_API_KEY"
# Enable LangSmith tracing in OTEL-only mode
os.environ["LANGSMITH_OTEL_ENABLED"] = "true"
os.environ["LANGSMITH_TRACING"] = "true"
os.environ["LANGSMITH_OTEL_ONLY"] = "true"
# OpenAI API key (replace with your actual key)
os.environ["OPENAI_API_KEY"] = "$OPENAI_API_KEY"
Once set, all LangChain and LangGraph traces will automatically be sent to your Orq.ai workspace.
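If you prefer not to hard-code keys in the script, here is a minimal sketch, assuming ORQ_API_KEY and OPENAI_API_KEY are already exported in your shell, that builds the header value at runtime:
import os
# Sketch: read the Orq.ai key from the shell environment instead of hard-coding it
orq_api_key = os.environ["ORQ_API_KEY"]
os.environ["OTEL_EXPORTER_OTLP_ENDPOINT"] = "https://api.orq.ai/v2/otel"
os.environ["OTEL_EXPORTER_OTLP_HEADERS"] = f"Authorization=Bearer {orq_api_key}"
os.environ["LANGSMITH_OTEL_ENABLED"] = "true"
os.environ["LANGSMITH_TRACING"] = "true"
os.environ["LANGSMITH_OTEL_ONLY"] = "true"
# OPENAI_API_KEY is read directly from the environment by langchain-openai, so no extra code is needed for it
Either approach works; the only requirement is that these variables are set before any LangChain or LangGraph code runs.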
Step 3: Send traces to Orq.ai
Here’s an example of running a simple LangGraph ReAct agent with a custom tool.
from langgraph.prebuilt import create_react_agent
# Define a simple tool the agent can call
def get_weather(city: str) -> str:
    """Get weather for a given city."""
    return f"It's always sunny in {city}!"
agent = create_react_agent(
    model="openai:gpt-5-mini",
    tools=[get_weather],
    prompt="You are a helpful assistant.",
)
# Run the agent — this will generate traces in Orq.ai
agent.invoke(
    {"messages": [{"role": "user", "content": "What is the weather in San Francisco?"}]}
)
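The snippet above discards the return value; as a quick sanity check, a minimal sketch for capturing it and printing the agent's final reply looks like this:
# Sketch: capture the final graph state and print the agent's last message
result = agent.invoke(
    {"messages": [{"role": "user", "content": "What is the weather in San Francisco?"}]}
)
print(result["messages"][-1].content)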
More Examples
Sending Traces with LangChain
The following snippet shows how to create and run a simple LangChain app that also sends traces to Orq.ai.
from langchain_openai import ChatOpenAI
from langchain_core.prompts import ChatPromptTemplate
# Define a prompt and chain
prompt = ChatPromptTemplate.from_template("Tell me a {action} about {topic}")
model = ChatOpenAI(temperature=0.7)
chain = prompt | model
# Invoke the chain
result = chain.invoke({"topic": "programming", "action": "joke"})
print(result.content)
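Optionally, you can pass LangChain's standard config argument to attach a run name, tags, and metadata to the invocation, which makes the resulting trace easier to identify; the values below are placeholders:
# Sketch: the same chain invocation with identifying metadata attached to the run
result = chain.invoke(
    {"topic": "programming", "action": "joke"},
    config={
        "run_name": "joke-chain",          # placeholder run name
        "tags": ["demo", "docs-example"],  # placeholder tags
        "metadata": {"user_id": "example-user"},  # placeholder metadata
    },
)
print(result.content)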
At this point, you should see traces from both the LangGraph and LangChain examples appear in your Orq.ai workspace.