AI Router
Overview
The Vercel AI SDK is a TypeScript toolkit for building AI-powered applications with streaming, structured outputs, and multi-model support. By connecting it to Orq.ai’s AI Router via the @orq-ai/vercel-provider package, you get access to 300+ models with a single provider setup.
Key Benefits
Orq.ai’s AI Router enhances your Vercel AI applications with:
- Complete Observability: track every generation, stream, and structured output with detailed traces.
- Built-in Reliability: automatic fallbacks, retries, and load balancing for production resilience.
- Cost Optimization: real-time cost tracking and spend management across all your AI operations.
- Multi-Provider Access: 300+ LLMs from 20+ providers through a single, unified integration.
Prerequisites
Before integrating Vercel AI with Orq.ai, ensure you have:
- An Orq.ai account and API key
- Node.js 18 or higher
To set up your API key, see API keys & Endpoints.
Installation
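The provider installs alongside the Vercel AI SDK. A typical install with npm (your package manager may differ):

```shell
npm install ai @orq-ai/vercel-provider
```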
Configuration
Configure the Orq.ai provider with your API key and the router base URL:

base_url: https://api.orq.ai/v2/router
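A minimal setup sketch. The createOrq factory name and option keys are assumptions; check the @orq-ai/vercel-provider README for the exact exports:

```typescript
// Sketch: factory and option names are assumptions, not confirmed API.
import { createOrq } from "@orq-ai/vercel-provider";

export const orq = createOrq({
  apiKey: process.env.ORQ_API_KEY,         // your Orq.ai API key
  baseURL: "https://api.orq.ai/v2/router", // the AI Router endpoint
});
```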
Text Generation
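A hedged example using the AI SDK's generateText with an Orq-routed model. The createOrq factory name and the provider/model identifier format are assumptions; adjust them to your catalog:

```typescript
import { generateText } from "ai";
import { createOrq } from "@orq-ai/vercel-provider"; // factory name assumed

const orq = createOrq({ apiKey: process.env.ORQ_API_KEY });

const { text, usage } = await generateText({
  // Model IDs are assumed to follow a provider/model convention.
  model: orq("openai/gpt-4o"),
  prompt: "Explain what an AI router does in two sentences.",
});

console.log(text);
console.log(`Tokens used: ${usage.totalTokens}`);
```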
Streaming Responses
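Streaming works the same way through the router via the SDK's streamText. A sketch (provider factory and model ID are assumptions):

```typescript
import { streamText } from "ai";
import { createOrq } from "@orq-ai/vercel-provider"; // factory name assumed

const orq = createOrq({ apiKey: process.env.ORQ_API_KEY });

// streamText returns immediately; tokens arrive on the async iterator.
const { textStream } = streamText({
  model: orq("anthropic/claude-3-5-sonnet"),
  prompt: "Write a short story about a robot.",
});

// Consume the stream chunk by chunk as it arrives.
for await (const chunk of textStream) {
  process.stdout.write(chunk);
}
```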
Structured Output
Use a JSON system prompt and parse the response:
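A sketch of the system-prompt-plus-parse approach (provider factory, model ID, and the response shape are illustrative):

```typescript
import { generateText } from "ai";
import { createOrq } from "@orq-ai/vercel-provider"; // factory name assumed

const orq = createOrq({ apiKey: process.env.ORQ_API_KEY });

const { text } = await generateText({
  model: orq("openai/gpt-4o"),
  system:
    "Reply with only a JSON object of the shape " +
    '{"title": string, "tags": string[]}. No prose, no code fences.',
  prompt: "Summarize: AI routers unify access to many model providers.",
});

// Strip accidental code fences before parsing, then validate the shape yourself.
const parsed = JSON.parse(text.replace(/^```(?:json)?\s*|\s*```$/g, ""));
console.log(parsed.title, parsed.tags);
```

Models occasionally wrap JSON in markdown fences even when told not to, hence the defensive strip before JSON.parse.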
Model Selection
With Orq.ai, you can use any supported model from 20+ providers:
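Only the model string changes between providers; the call shape stays the same. The identifiers below are illustrative — the exact strings depend on your Orq.ai model catalog:

```typescript
import { generateText } from "ai";
import { createOrq } from "@orq-ai/vercel-provider"; // factory name assumed

const orq = createOrq({ apiKey: process.env.ORQ_API_KEY });

// Swap providers by swapping the model identifier; no other code changes.
const models = [
  "openai/gpt-4o",
  "anthropic/claude-3-5-sonnet",
  "google/gemini-1.5-pro",
];

for (const model of models) {
  const { text } = await generateText({
    model: orq(model),
    prompt: "Say hello in one word.",
  });
  console.log(`${model}: ${text}`);
}
```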
Observability
Getting Started
The Vercel AI SDK provides powerful React hooks and utilities for building AI-powered applications with built-in OpenTelemetry support. The SDK includes experimental telemetry features that automatically capture detailed traces of AI operations, making integration with Orq.ai straightforward for comprehensive observability.

Prerequisites
Before you begin, ensure you have:
- An Orq.ai account and an API key.
- Vercel AI SDK v3.1+ (with telemetry support).
- Node.js 18+ and TypeScript support.
- API keys for your LLM providers (OpenAI, Anthropic, etc.).
Install Dependencies
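A plausible dependency set for OpenTelemetry-based tracing in Node.js — the exact packages may vary with your setup:

```shell
npm install ai @opentelemetry/api @opentelemetry/sdk-node \
  @opentelemetry/exporter-trace-otlp-http
```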
Configure Orq.ai
Set up your environment variables to connect to Orq.ai’s OpenTelemetry collector (Unix/Linux/macOS):
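A hedged sketch of the exports — the collector path and header format here are assumptions; confirm the exact values in your Orq.ai workspace:

```shell
# Endpoint path and header name are assumptions; verify against Orq.ai docs.
export OTEL_EXPORTER_OTLP_ENDPOINT="https://api.orq.ai/v2/otel"
export OTEL_EXPORTER_OTLP_HEADERS="Authorization=Bearer <your-orq-api-key>"
```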
Integrations
The Vercel AI SDK has built-in OpenTelemetry support through the experimental_telemetry option. Here’s how to integrate it with Orq.ai:
Built-in Telemetry (Recommended)
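A sketch of a call with the per-call telemetry option enabled — the provider, function id, and metadata values are illustrative:

```typescript
import { generateText } from "ai";
import { openai } from "@ai-sdk/openai"; // any provider works here

const { text } = await generateText({
  model: openai("gpt-4o"),
  prompt: "Hello!",
  // Native AI SDK telemetry: spans are exported via your OpenTelemetry setup.
  experimental_telemetry: {
    isEnabled: true,
    functionId: "greet-user",    // shows up as the operation name
    metadata: { userId: "123" }, // custom attributes attached to the span
  },
});
```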
Enabling the experimental_telemetry option per call is the simplest way to turn on telemetry; no separate instrumentation code is required.

Asset Capture in the Control Tower
When you instrument the Vercel AI SDK with OpenTelemetry and send traces to Orq.ai, agents, tools, and models are automatically extracted from the spans and registered in Control Tower.

Installation
Configuration
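One way to wire the OTLP exporter in Node.js — a sketch; the package choices and service name are assumptions, and the exporter falls back to the OTEL_EXPORTER_OTLP_* environment variables when no explicit values are passed:

```typescript
// instrumentation.ts — Node.js tracing setup sketch.
import { NodeSDK } from "@opentelemetry/sdk-node";
import { OTLPTraceExporter } from "@opentelemetry/exporter-trace-otlp-http";

const sdk = new NodeSDK({
  // Reads OTEL_EXPORTER_OTLP_ENDPOINT / OTEL_EXPORTER_OTLP_HEADERS if unset.
  traceExporter: new OTLPTraceExporter(),
  serviceName: "my-ai-app", // illustrative service name
});

sdk.start();
```

Import this file before the rest of your application so spans from the AI SDK are captured from the first call.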
Agent Detection
The Vercel AI SDK does not have a built-in way to mark a span as an agent. To capture agents in Control Tower, use manual OpenTelemetry instrumentation with tracer.startActiveSpan(). The span name (e.g., "translator-agent") becomes the agent name in Control Tower. This approach also captures tools and models, and avoids the OpenAI Responses API, which has limitations with tool schemas.
Captures: agent/translator-agent, tool/translate, model/gpt-4o
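A sketch of manual agent instrumentation matching the names above — the provider, tool stub, and tracer name are illustrative, and the Control Tower extraction rules are as described in the text:

```typescript
import { trace } from "@opentelemetry/api";
import { generateText, tool } from "ai";
import { openai } from "@ai-sdk/openai"; // any provider; names illustrative
import { z } from "zod";

const tracer = trace.getTracer("my-ai-app");

// Defined as an SDK tool so it can be captured as tool/translate.
const translate = tool({
  description: "Translate text into a target language",
  parameters: z.object({ text: z.string(), target: z.string() }),
  execute: async ({ text, target }) => `(${target}) ${text}`, // stub
});

export async function runTranslatorAgent(input: string): Promise<string> {
  // The span name "translator-agent" becomes the agent name in Control Tower.
  return tracer.startActiveSpan("translator-agent", async (span) => {
    try {
      const { text } = await generateText({
        model: openai("gpt-4o"), // captured as model/gpt-4o
        prompt: `Translate to French: ${input}`,
        tools: { translate },
        experimental_telemetry: { isEnabled: true },
      });
      return text;
    } finally {
      span.end(); // always close the span, even on errors
    }
  });
}
```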