Mastra is a TypeScript framework for building AI-powered applications with pipelines, agents, and workflows. By connecting Mastra to Orq.ai’s AI Router, you transform experimental AI applications into production-ready systems with enterprise-grade capabilities.
Here’s a complete example of creating and running a Mastra agent through Orq.ai:
```typescript
import { createOpenAI } from "@ai-sdk/openai";
import { Agent } from "@mastra/core/agent";

// Configure provider with Orq.ai AI Router
const orqProvider = createOpenAI({
  apiKey: process.env.ORQ_API_KEY!,
  baseURL: "https://api.orq.ai/v2/router",
});

// Create agent
export const assistantAgent = new Agent({
  name: "Assistant",
  instructions: "You are a helpful assistant that explains complex concepts simply.",
  model: orqProvider("gpt-4o"),
});

// Run the agent
async function main() {
  const result = await assistantAgent.generate("Explain quantum computing in simple terms");
  console.log(result.text);
}

main().catch(console.error);
```
Tool Calling Limitation: Mastra’s tool call format in the Responses API currently has schema incompatibilities with the AI Router. Basic agent usage works, but tool-based workflows require the Observability integration or direct provider access instead.
With Orq.ai, you can use any supported model from 20+ providers:
```typescript
import { createOpenAI } from "@ai-sdk/openai";
import { Agent } from "@mastra/core/agent";

// Configure provider
const orqProvider = createOpenAI({
  apiKey: process.env.ORQ_API_KEY!,
  baseURL: "https://api.orq.ai/v2/router",
});

// Use Claude
export const claudeAgent = new Agent({
  name: "Claude Assistant",
  model: orqProvider("claude-sonnet-4-5-20250929"),
  instructions: "You are a helpful assistant.",
});

// Use Gemini
export const geminiAgent = new Agent({
  name: "Gemini Assistant",
  model: orqProvider("gemini-2.5-flash"),
  instructions: "You are a helpful assistant.",
});

// Use any other model
export const groqAgent = new Agent({
  name: "Groq Assistant",
  model: orqProvider("llama-3.3-70b-versatile"),
  instructions: "You are a helpful assistant.",
});

// Run with different models
async function main() {
  const result = await claudeAgent.generate("Explain machine learning");
  console.log(result.text);
}

main().catch(console.error);
```
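Because every agent goes through the same router, switching models only means passing a different model ID string. A minimal sketch of selecting a model ID per task category (the `modelForTask` helper and the task names are illustrative, not part of Mastra or Orq.ai — the model IDs match the agents above):

```typescript
// Hypothetical lookup table: task category -> model ID on the AI Router.
// The categories are examples only; use whatever split fits your workload.
const MODEL_BY_TASK: Record<string, string> = {
  reasoning: "claude-sonnet-4-5-20250929",
  fast: "gemini-2.5-flash",
  bulk: "llama-3.3-70b-versatile",
};

// Resolve a task category to a model ID, falling back to a default.
export function modelForTask(task: string, fallback = "gpt-4o"): string {
  return MODEL_BY_TASK[task] ?? fallback;
}

console.log(modelForTask("fast"));    // gemini-2.5-flash
console.log(modelForTask("unknown")); // gpt-4o
```

The returned string can then be handed to the provider, e.g. `orqProvider(modelForTask("fast"))`, when constructing an agent.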
Integrate Mastra with Orq.ai’s observability to gain complete insights into pipeline execution, agent performance, workflow orchestration, and system reliability using OpenTelemetry.
In your Mastra server configuration, enable telemetry export. The environment variables set earlier will be picked up automatically.
```typescript
export const mastra = new Mastra({
  // ... other config
  telemetry: {
    serviceName: "my-app",
    enabled: true,
    export: {
      type: "otlp",
      // endpoint and headers will be picked up from env vars
    },
  },
});
```
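For reference, the OTLP exporter reads the standard OpenTelemetry environment variables. The values below are placeholders, assuming a Bearer-token header; substitute the endpoint and API key from your Orq.ai workspace:

```shell
# Standard OpenTelemetry exporter variables (names defined by the OTel spec).
# The values are placeholders — use the ones from your Orq.ai workspace.
export OTEL_EXPORTER_OTLP_ENDPOINT="<your-otlp-endpoint>"
export OTEL_EXPORTER_OTLP_HEADERS="Authorization=Bearer <ORQ_API_KEY>"
```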