Vercel AI SDK
Integrate Orq.ai with Vercel AI SDK using OpenTelemetry
Getting Started
The Vercel AI SDK provides powerful React hooks and utilities for building AI-powered applications with built-in OpenTelemetry support. The SDK includes experimental telemetry features that automatically capture detailed traces of AI operations, making integration with Orq.ai straightforward for comprehensive observability.
Prerequisites
Before you begin, ensure you have:
- An Orq.ai account and an API Key.
- Vercel AI SDK v3.3+ (experimental telemetry support was introduced in v3.3).
- Node.js 18+ and TypeScript support.
- API keys for your LLM providers (OpenAI, Anthropic, etc.).
Install Dependencies
# Core Vercel AI SDK with latest version
npm install ai@latest
# OpenTelemetry packages
npm install @opentelemetry/api @opentelemetry/sdk-node @opentelemetry/exporter-trace-otlp-http
npm install @opentelemetry/instrumentation @opentelemetry/resources
npm install @opentelemetry/semantic-conventions
# Vercel OpenTelemetry helper (used by registerOTel in the example below)
npm install @vercel/otel
# Provider SDKs (choose what you need)
npm install @ai-sdk/openai @ai-sdk/anthropic @ai-sdk/google
# Optional: For React applications
npm install @ai-sdk/react
Configure Orq.ai
Set up your environment variables to connect to Orq.ai's OpenTelemetry collector:
Unix/Linux/macOS:
export OTEL_EXPORTER_OTLP_ENDPOINT="https://api.orq.ai/v2/otel"
export OTEL_EXPORTER_OTLP_HEADERS="Authorization=Bearer <ORQ_API_KEY>"
export OTEL_RESOURCE_ATTRIBUTES="service.name=vercel-ai-app,service.version=1.0.0"
export OPENAI_API_KEY="<YOUR_OPENAI_API_KEY>"
Windows (PowerShell):
$env:OTEL_EXPORTER_OTLP_ENDPOINT = "https://api.orq.ai/v2/otel"
$env:OTEL_EXPORTER_OTLP_HEADERS = "Authorization=Bearer <ORQ_API_KEY>"
$env:OTEL_RESOURCE_ATTRIBUTES = "service.name=vercel-ai-app,service.version=1.0.0"
$env:OPENAI_API_KEY = "<YOUR_OPENAI_API_KEY>"
Using .env file:
OTEL_EXPORTER_OTLP_ENDPOINT=https://api.orq.ai/v2/otel
OTEL_EXPORTER_OTLP_HEADERS=Authorization=Bearer <ORQ_API_KEY>
OTEL_RESOURCE_ATTRIBUTES=service.name=vercel-ai-app,service.version=1.0.0
OPENAI_API_KEY=<YOUR_OPENAI_API_KEY>
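Frameworks such as Next.js load .env automatically; in a plain Node.js script you can load it yourself, for example with the dotenv package. A minimal sketch, assuming dotenv is installed (npm install dotenv):
// env.js — load .env before anything reads process.env
import 'dotenv/config';

console.log(process.env.OTEL_EXPORTER_OTLP_ENDPOINT); // https://api.orq.ai/v2/otel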
Integrations
The Vercel AI SDK has built-in OpenTelemetry support through the experimental_telemetry option. Here's how to integrate it with Orq.ai:
Built-in Telemetry (Recommended)
The simplest way to enable telemetry is using the SDK's native support:
// instrumentation.js
import { registerOTel, OTLPHttpJsonTraceExporter } from '@vercel/otel';

export function register() {
  registerOTel({
    serviceName: 'your-project-name',
    traceExporter: new OTLPHttpJsonTraceExporter({
      url: 'https://api.orq.ai/v2/otel/v1/traces',
      headers: {
        Authorization: `Bearer ${process.env.ORQ_API_KEY}`,
      },
    }),
  });
}
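In a Next.js project, placing this file at the project root as instrumentation.ts lets Next.js call register() automatically on startup. If you prefer not to use @vercel/otel — for example in a plain Node.js script — the @opentelemetry packages installed earlier can do the same job. A minimal sketch, assuming the environment variables from the previous section are set (in which case the url and headers options can be omitted entirely):
// instrumentation.js — alternative setup without @vercel/otel
import { NodeSDK } from '@opentelemetry/sdk-node';
import { OTLPTraceExporter } from '@opentelemetry/exporter-trace-otlp-http';

const sdk = new NodeSDK({
  serviceName: 'vercel-ai-app',
  traceExporter: new OTLPTraceExporter({
    // Both options can be dropped if OTEL_EXPORTER_OTLP_ENDPOINT and
    // OTEL_EXPORTER_OTLP_HEADERS are set in the environment.
    url: 'https://api.orq.ai/v2/otel/v1/traces',
    headers: {
      Authorization: `Bearer ${process.env.ORQ_API_KEY}`,
    },
  }),
});

sdk.start();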
// index.js
import { register } from './instrumentation.js';
import { generateText, streamText, generateObject } from "ai";
import { openai } from "@ai-sdk/openai";

// Start the tracer before making any AI calls
// (in Next.js, register() is called automatically)
register();

// Simple usage with telemetry enabled
const result = await generateText({
  model: openai("gpt-4.1"),
  prompt: "Write a short story about a robot",
  experimental_telemetry: {
    isEnabled: true,
  },
});

// Advanced configuration with custom metadata
const resultWithMetadata = await generateText({
  model: openai("gpt-4.1"),
  prompt: "Explain quantum computing",
  experimental_telemetry: {
    isEnabled: true,
    functionId: "quantum-explanation",
    metadata: {
      userId: "user-123",
      requestId: "req-456",
      environment: "production",
    },
  },
});

// Control what data is recorded
const userPrompt = "Summarize my notes"; // any user-supplied input
const resultWithPrivacy = await generateText({
  model: openai("gpt-4.1"),
  prompt: userPrompt,
  experimental_telemetry: {
    isEnabled: true,
    recordInputs: false, // Don't record prompts
    recordOutputs: false, // Don't record responses
  },
});
The key to enabling telemetry is including the experimental_telemetry option in each call; the same option works across generateText, streamText, and generateObject. The recordInputs and recordOutputs flags are optional and control whether prompts and responses are captured.
experimental_telemetry: {
  isEnabled: true,
  recordInputs: false, // Don't record prompts
  recordOutputs: false, // Don't record responses
},
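For example, the same option applies unchanged to a streaming call. A short sketch; the functionId here is an arbitrary label chosen for illustration:
// Streaming with telemetry enabled — the option is identical to generateText
const stream = await streamText({
  model: openai("gpt-4.1"),
  prompt: "List three uses for OpenTelemetry",
  experimental_telemetry: {
    isEnabled: true,
    functionId: "otel-uses", // hypothetical identifier for grouping traces
  },
});

for await (const textPart of stream.textStream) {
  process.stdout.write(textPart);
}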