AWS Strands Integration | AI Router & OpenTelemetry
Connect AWS Strands Agents to Orq.ai’s AI Router with OpenTelemetry observability. Access 300+ LLMs, built-in reliability, and complete tracing of agent interactions.
AI Router
Route your LLM calls through the AI Router with a single base URL change. Zero vendor lock-in: always run on the best model at the lowest cost for your use case.
Observability
Instrument your code with OpenTelemetry to capture traces, logs, and metrics for every LLM call, agent step, and tool use.
AWS Strands is a framework for building AI agents with structured reasoning and tool use. By connecting AWS Strands to Orq.ai’s AI Router, you transform experimental agents into production-ready systems with enterprise-grade capabilities.
Configure Strands Agents to use Orq.ai’s AI Router by passing custom client arguments with the base URL:
```python
from strands import Agent
from strands.models.openai import OpenAIModel
import os

# Configure model with Orq.ai AI Router
model = OpenAIModel(
    model_id="gpt-4o",
    client_args={
        "api_key": os.getenv('ORQ_API_KEY'),
        "base_url": "https://api.orq.ai/v2/router"
    }
)

# Create agent with Orq.ai-powered model
agent = Agent(
    model=model,
    system_prompt="You are a helpful AI assistant."
)
```
Here’s a complete example of creating and running a Strands agent through Orq.ai:
```python
from strands import Agent
from strands.models.openai import OpenAIModel
import os

# Configure model with Orq.ai AI Router
model = OpenAIModel(
    model_id="gpt-4o",
    client_args={
        "api_key": os.getenv('ORQ_API_KEY'),
        "base_url": "https://api.orq.ai/v2/router"
    }
)

# Create a simple agent
agent = Agent(
    model=model,
    system_prompt="You are a research assistant that helps users find and summarize information."
)

# Run the agent
result = agent("Explain quantum computing in simple terms")
print(result)
```
Strands agents can use tools while routing through Orq.ai:
```python
from strands import Agent, tool
from strands.models.openai import OpenAIModel
import os

# Configure model
model = OpenAIModel(
    model_id="gpt-4o",
    client_args={
        "api_key": os.getenv('ORQ_API_KEY'),
        "base_url": "https://api.orq.ai/v2/router"
    }
)

# Define a custom tool using the @tool decorator
@tool
def search_database(query: str) -> str:
    """Search the knowledge database for relevant information."""
    # Your database search logic here
    return f"Search results for: {query}"

# Create agent with tools
agent = Agent(
    model=model,
    tools=[search_database],
    system_prompt="You are a knowledge assistant. Use the search_database tool to find information when needed."
)

# Run agent with tool access
result = agent("Find information about machine learning best practices")
print(result)
```
AWS Strands integrates with Orq.ai’s Observability Platform through OpenTelemetry. Capture complete traces of your agent interactions, tool calls, and model invocations to gain deep insights into agent behavior, performance, and costs.
The examples below use OpenAI models with OpenTelemetry tracing sent to Orq.ai. Set both OPENAI_API_KEY (for model access) and ORQ_API_KEY (for telemetry export) environment variables.
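The export setup can be sketched with the standard OpenTelemetry Python SDK and an OTLP/HTTP exporter. This is a minimal sketch, not the definitive configuration: the collector endpoint (`https://api.orq.ai/v2/otel/v1/traces`), the `Authorization` header format, and the assumption that Strands picks up the global tracer provider automatically are all illustrative; verify the exact values against Orq.ai's observability documentation.

```python
import os

from opentelemetry import trace
from opentelemetry.sdk.resources import Resource
from opentelemetry.sdk.trace import TracerProvider
from opentelemetry.sdk.trace.export import BatchSpanProcessor
from opentelemetry.exporter.otlp.proto.http.trace_exporter import OTLPSpanExporter

# Assumed endpoint and auth header -- confirm against Orq.ai's docs
exporter = OTLPSpanExporter(
    endpoint="https://api.orq.ai/v2/otel/v1/traces",
    headers={"Authorization": f"Bearer {os.getenv('ORQ_API_KEY')}"},
)

# Register a global tracer provider; spans are exported in batches
# in the background so agent latency is unaffected
provider = TracerProvider(
    resource=Resource.create({"service.name": "strands-agent"})
)
provider.add_span_processor(BatchSpanProcessor(exporter))
trace.set_tracer_provider(provider)
```

With this in place before the agent is created, any OpenTelemetry spans emitted during agent runs (model calls, tool invocations) flow to the configured endpoint; requires the `opentelemetry-sdk` and `opentelemetry-exporter-otlp-proto-http` packages.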