Vercel AI
Use the Orq.ai AI Gateway with the Vercel AI SDK to build streamable AI interfaces
Prerequisites
To start using the Vercel AI SDK with Orq, you need an API key from your Orq.ai account.
To set up your API key, see API keys & Endpoints.
To use libraries with private models, see Onboarding Private Models.
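The examples below read the API key from the `ORQ_API_KEY` environment variable. A minimal way to provide it in a shell session (the key value shown is a placeholder):

```shell
# Make the key available to Node.js via process.env.ORQ_API_KEY
export ORQ_API_KEY="your-api-key"
```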
Using Orq.ai as an AI Gateway
With the Orq.ai provider for the Vercel AI SDK, you can integrate AI models from the Orq.ai platform with full Vercel AI SDK compatibility.
Routing requests through the Orq.ai AI Gateway gives you platform traces and cost and usage monitoring, while keeping a unified API across all models in the Vercel AI SDK.
Installation:

```shell
npm install @orq-ai/vercel-provider ai
```
Text Generation
```typescript
import { createOrqAiProvider } from "@orq-ai/vercel-provider";
import { generateText } from "ai";

const orq = createOrqAiProvider({
  apiKey: process.env.ORQ_API_KEY,
});

// Generate a complete response in a single call.
const { text } = await generateText({
  model: orq("openai/gpt-4o"),
  prompt: "Write a haiku about programming",
});

console.log(text);
```
Streaming Responses
```typescript
import { createOrqAiProvider } from "@orq-ai/vercel-provider";
import { streamText } from "ai";

const orq = createOrqAiProvider({
  apiKey: process.env.ORQ_API_KEY,
});

const { textStream } = await streamText({
  model: orq("openai/gpt-4o-mini"),
  messages: [
    { role: "system", content: "You are a helpful assistant." },
    { role: "user", content: "Hello!" },
  ],
});

// Print tokens as they arrive instead of waiting for the full response.
for await (const chunk of textStream) {
  process.stdout.write(chunk);
}
```
Structured Output
```typescript
import { createOrqAiProvider } from "@orq-ai/vercel-provider";
import { generateObject } from "ai";
import { z } from "zod";

const orq = createOrqAiProvider({
  apiKey: process.env.ORQ_API_KEY,
});

// The model's output is validated against the Zod schema.
const { object } = await generateObject({
  model: orq("openai/gpt-4o"),
  schema: z.object({
    name: z.string(),
    age: z.number(),
    city: z.string(),
  }),
  prompt: "Generate information for a random person",
});

console.log(object);
```
Configuration Options
The Orq.ai provider supports the following configuration options:
```typescript
const orq = createOrqAiProvider({
  apiKey: "your-api-key", // Required: Orq API key
  baseURL: "https://api.orq.ai/v2/proxy", // Optional: custom API endpoint
  headers: {
    // Optional: additional headers sent with every request
    "X-Custom-Header": "value",
  },
});
```
Supported Model Types
- Chat models: For conversational AI applications
- Completion models: For text generation tasks
- Text embeddings: For semantic search and similarity
- Image generation: For creating images from text prompts