AI Router

Route your LLM calls through the AI Router with a single base URL change. Zero vendor lock-in: always run on the best model at the lowest cost for your use case.

Observability

Instrument your code with OpenTelemetry to capture traces, logs, and metrics for every LLM call, agent step, and tool use.

AI Router

Overview

AWS Strands is a framework for building AI agents with structured reasoning and tool use. By connecting AWS Strands to Orq.ai’s AI Router, you transform experimental agents into production-ready systems with enterprise-grade capabilities.

Key Benefits

Orq.ai’s AI Router enhances your AWS Strands Agents with:

Complete Observability

Track every agent step, tool use, and interaction with detailed traces and analytics

Built-in Reliability

Automatic fallbacks, retries, and load balancing for production resilience

Cost Optimization

Real-time cost tracking and spend management across all your AI operations

Multi-Provider Access

Access 300+ LLMs and 20+ providers through a single, unified integration

Prerequisites

Before integrating AWS Strands with Orq.ai, ensure you have:
  • An Orq.ai account and API Key
  • Python 3.10 or higher
  • AWS Strands SDK installed
To set up your API key, see API keys & Endpoints.
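Once you have created a key, a common pattern is to expose it to your shell session as an environment variable; the variable name ORQ_API_KEY matches the examples in this guide:

```shell
# Make the Orq.ai API key available to the examples below
export ORQ_API_KEY="your-orq-api-key"
```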

Installation

Install the Strands Agents SDK (requires Python 3.10+):
# Install Strands Agents SDK
pip install strands-agents

# Optional: Install additional tools
pip install strands-agents-tools

Configuration

Configure Strands Agents to use Orq.ai’s AI Router by passing custom client arguments with the base URL:
from strands import Agent
from strands.models.openai import OpenAIModel
import os

# Configure model with Orq.ai AI Router
model = OpenAIModel(
    model_id="gpt-4o",
    client_args={
        "api_key": os.getenv('ORQ_API_KEY'),
        "base_url": "https://api.orq.ai/v2/router"
    }
)

# Create agent with Orq.ai-powered model
agent = Agent(
    model=model,
    system_prompt="You are a helpful AI assistant."
)
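Since every example in this guide passes the same client_args, it can help to centralize them in a small helper that fails fast when the key is missing. The function name orq_client_args is our own convenience, not part of the Strands SDK:

```python
import os

def orq_client_args(base_url: str = "https://api.orq.ai/v2/router") -> dict:
    """Build the client_args dict for OpenAIModel, failing fast if the key is unset."""
    api_key = os.getenv("ORQ_API_KEY")
    if not api_key:
        raise RuntimeError("ORQ_API_KEY environment variable is not set")
    return {"api_key": api_key, "base_url": base_url}
```

Pass the result directly, e.g. `OpenAIModel(model_id="gpt-4o", client_args=orq_client_args())`.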
Basic Agent Example

Here’s a complete example of creating and running a Strands agent through Orq.ai:
from strands import Agent
from strands.models.openai import OpenAIModel
import os

# Configure model with Orq.ai AI Router
model = OpenAIModel(
    model_id="gpt-4o",
    client_args={
        "api_key": os.getenv('ORQ_API_KEY'),
        "base_url": "https://api.orq.ai/v2/router"
    }
)

# Create a simple agent
agent = Agent(
    model=model,
    system_prompt="You are a research assistant that helps users find and summarize information."
)

# Run the agent
result = agent("Explain quantum computing in simple terms")
print(result)

Agent with Tools

Strands agents can use tools while routing through Orq.ai:
from strands import Agent, tool
from strands.models.openai import OpenAIModel
import os

# Configure model
model = OpenAIModel(
    model_id="gpt-4o",
    client_args={
        "api_key": os.getenv('ORQ_API_KEY'),
        "base_url": "https://api.orq.ai/v2/router"
    }
)

# Define a custom tool using the @tool decorator
@tool
def search_database(query: str) -> str:
    """Search the knowledge database for relevant information."""
    # Your database search logic here
    return f"Search results for: {query}"

# Create agent with tools
agent = Agent(
    model=model,
    tools=[search_database],
    system_prompt="You are a knowledge assistant. Use the search_database tool to find information when needed."
)

# Run agent with tool access
result = agent("Find information about machine learning best practices")
print(result)
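The search_database body above is a stub. In a real application it would query an actual store; a minimal in-memory stand-in (the DOCS dict here is purely illustrative) shows the shape such a tool body might take:

```python
# Hypothetical in-memory stand-in for the store behind search_database
DOCS = {
    "machine learning best practices": "Version your data, track experiments, monitor for drift.",
    "prompt engineering": "Be specific, give examples, and constrain the output format.",
}

def search_database(query: str) -> str:
    """Search the knowledge database for relevant information."""
    hit = DOCS.get(query.strip().lower())
    return hit if hit is not None else f"No results for: {query}"
```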

Fallback Configuration

Fallbacks, retries, and load balancing are applied by the AI Router on the server side, so no additional client-side configuration is needed; point Strands at the router and these reliability features apply automatically:
from strands import Agent
from strands.models.openai import OpenAIModel
import os

# Configure model - Orq.ai handles fallbacks automatically
model = OpenAIModel(
    model_id="gpt-4o",
    client_args={
        "api_key": os.getenv('ORQ_API_KEY'),
        "base_url": "https://api.orq.ai/v2/router"
    }
)

agent = Agent(
    model=model,
    system_prompt="You are a helpful assistant."
)
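The router handles model-level fallbacks for you. If you additionally want client-side retries for transport-level failures (a network timeout before the request reaches the router, for example), a small wrapper along these lines can help. This is our own sketch, not part of either SDK:

```python
import time

def with_retries(call, attempts: int = 3, backoff: float = 1.0):
    """Invoke `call`, retrying with exponential backoff; re-raise after the final attempt."""
    for attempt in range(attempts):
        try:
            return call()
        except Exception:
            if attempt == attempts - 1:
                raise
            time.sleep(backoff * (2 ** attempt))

# Example: result = with_retries(lambda: agent("Explain quantum computing"))
```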

Observability

AWS Strands integrates with Orq.ai’s Observability Platform through OpenTelemetry. Capture complete traces of your agent interactions, tool calls, and model invocations to gain deep insights into agent behavior, performance, and costs.

Prerequisites

Before you begin, ensure you have:
  • An Orq.ai account and an API Key
  • AWS Strands SDK installed
  • Python 3.10+
The examples below use OpenAI models with OpenTelemetry tracing sent to Orq.ai. Set both OPENAI_API_KEY (for model access) and ORQ_API_KEY (for telemetry export) environment variables.

Installation

pip install strands-agents opentelemetry-api opentelemetry-sdk opentelemetry-exporter-otlp-proto-http

Configuration

Configure OpenTelemetry to send traces to Orq.ai:
import os
from opentelemetry import trace
from opentelemetry.sdk.trace import TracerProvider
from opentelemetry.sdk.trace.export import BatchSpanProcessor
from opentelemetry.exporter.otlp.proto.http.trace_exporter import OTLPSpanExporter
from opentelemetry.sdk.resources import Resource

# Configure OTLP exporter
exporter = OTLPSpanExporter(
    endpoint="https://api.orq.ai/v2/otel/v1/traces",
    headers={"Authorization": f"Bearer {os.getenv('ORQ_API_KEY')}"}
)

# Create tracer provider with service name
resource = Resource(attributes={
    "service.name": "strands-agent-app"
})
provider = TracerProvider(resource=resource)
provider.add_span_processor(BatchSpanProcessor(exporter))

# Set as global tracer provider
trace.set_tracer_provider(provider)
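The OTLP exporter also honors the standard OpenTelemetry environment variables, so when the exporter and resource are constructed without explicit arguments the same configuration can be supplied outside the code, which is convenient in containers and CI:

```shell
# Equivalent exporter configuration via standard OpenTelemetry variables
export OTEL_EXPORTER_OTLP_TRACES_ENDPOINT="https://api.orq.ai/v2/otel/v1/traces"
export OTEL_EXPORTER_OTLP_HEADERS="Authorization=Bearer ${ORQ_API_KEY}"
export OTEL_SERVICE_NAME="strands-agent-app"
```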

Basic Example

import os
from opentelemetry import trace
from opentelemetry.sdk.trace import TracerProvider
from opentelemetry.sdk.trace.export import BatchSpanProcessor
from opentelemetry.exporter.otlp.proto.http.trace_exporter import OTLPSpanExporter
from opentelemetry.sdk.resources import Resource
from strands import Agent
from strands.models.openai import OpenAIModel

# Setup OpenTelemetry
exporter = OTLPSpanExporter(
    endpoint="https://api.orq.ai/v2/otel/v1/traces",
    headers={"Authorization": f"Bearer {os.getenv('ORQ_API_KEY')}"}
)
resource = Resource(attributes={"service.name": "strands-agent"})
provider = TracerProvider(resource=resource)
provider.add_span_processor(BatchSpanProcessor(exporter))
trace.set_tracer_provider(provider)

# Configure model
model = OpenAIModel(
    model_id="gpt-4o",
    client_args={
        "api_key": os.getenv('OPENAI_API_KEY'),
    }
)

# Create agent
agent = Agent(
    model=model,
    system_prompt="You are a helpful research assistant."
)

# Run agent (traces automatically sent to Orq.ai)
result = agent("Explain quantum computing in simple terms")
print(result)
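BatchSpanProcessor buffers spans and exports them in the background, so a short-lived script can exit before its final batch is sent. A small atexit hook guards against that; the helper is our own pattern, but force_flush and shutdown are standard TracerProvider methods:

```python
import atexit

def flush_traces(provider) -> None:
    """Flush buffered spans and shut the tracer provider down cleanly."""
    provider.force_flush()
    provider.shutdown()

# Register against the provider created during setup:
# atexit.register(flush_traces, provider)
```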

Agent with Tools

import os
from opentelemetry import trace
from opentelemetry.sdk.trace import TracerProvider
from opentelemetry.sdk.trace.export import BatchSpanProcessor
from opentelemetry.exporter.otlp.proto.http.trace_exporter import OTLPSpanExporter
from opentelemetry.sdk.resources import Resource
from strands import Agent, tool
from strands.models.openai import OpenAIModel

# Setup OpenTelemetry
exporter = OTLPSpanExporter(
    endpoint="https://api.orq.ai/v2/otel/v1/traces",
    headers={"Authorization": f"Bearer {os.getenv('ORQ_API_KEY')}"}
)
resource = Resource(attributes={"service.name": "strands-agent-tools"})
provider = TracerProvider(resource=resource)
provider.add_span_processor(BatchSpanProcessor(exporter))
trace.set_tracer_provider(provider)

# Configure model
model = OpenAIModel(
    model_id="gpt-4o",
    client_args={
        "api_key": os.getenv('OPENAI_API_KEY'),
    }
)

# Define tools
@tool
def search_database(query: str) -> str:
    """Search the knowledge database for relevant information."""
    return f"Search results for: {query}"

@tool
def get_weather(location: str) -> str:
    """Get current weather for a location."""
    return f"Weather in {location}: Sunny, 72°F"

# Create agent with tools
agent = Agent(
    model=model,
    tools=[search_database, get_weather],
    system_prompt="You are a helpful assistant with access to search and weather tools."
)

# Run agent (all tool calls traced)
result = agent("What's the weather like in San Francisco?")
print(result)