AI Router

Overview

Mastra is a TypeScript framework for building AI-powered applications with pipelines, agents, and workflows. By connecting Mastra to Orq.ai’s AI Router, you transform experimental AI applications into production-ready systems with enterprise-grade capabilities.

Key Benefits

Orq.ai’s AI Router enhances your Mastra applications with:

Complete Observability

Track every agent step, tool use, and interaction with detailed traces and analytics

Built-in Reliability

Automatic fallbacks, retries, and load balancing for production resilience

Cost Optimization

Real-time cost tracking and spend management across all your AI operations

Multi-Provider Access

Access 300+ LLMs and 20+ providers through a single, unified integration

Prerequisites

Before integrating Mastra with Orq.ai, ensure you have:
  • An Orq.ai account and API Key
  • Node.js 18 or higher
  • TypeScript support
  • Mastra installed in your project
To set up your API key, see API keys & Endpoints.
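The code examples in this guide read the key from an `ORQ_API_KEY` environment variable; the variable name is a convention used here, not something the SDK requires:

```shell
# Make the key available to your Node process (the value is a placeholder).
export ORQ_API_KEY="your-orq-api-key"
```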

Installation

Install Mastra and the AI SDK:
npm install @mastra/core @ai-sdk/openai

Configuration

Configure Mastra to use Orq.ai’s AI Router by creating an OpenAI provider with a custom base URL:
TypeScript
import { createOpenAI } from "@ai-sdk/openai";
import { Agent } from "@mastra/core/agent";

// Configure OpenAI provider with Orq.ai AI Router
const orqProvider = createOpenAI({
  apiKey: process.env.ORQ_API_KEY,
  baseURL: "https://api.orq.ai/v2/router",
});

// Create agent with Orq.ai-powered model
const agent = new Agent({
  name: "Assistant",
  instructions: "You are a helpful assistant.",
  model: orqProvider("gpt-4o"),
});

Basic Agent Example

Here’s a complete example of creating and running a Mastra agent through Orq.ai:
TypeScript
import { createOpenAI } from "@ai-sdk/openai";
import { Agent } from "@mastra/core/agent";

// Configure provider with Orq.ai AI Router
const orqProvider = createOpenAI({
  apiKey: process.env.ORQ_API_KEY!,
  baseURL: "https://api.orq.ai/v2/router",
});

// Create agent
export const assistantAgent = new Agent({
  name: "Assistant",
  instructions: "You are a helpful assistant that explains complex concepts simply.",
  model: orqProvider("gpt-4o"),
});

// Run the agent
async function main() {
  const result = await assistantAgent.generate("Explain quantum computing in simple terms");
  console.log(result.text);
}

main().catch(console.error);

Tool Calling Limitation: Mastra’s tool-call format in the Responses API currently has schema incompatibilities with the AI Router. Basic agent usage works, but tool-based workflows require the Observability integration or direct provider access.

Model Selection

With Orq.ai, you can use any supported model from 20+ providers:
TypeScript
import { createOpenAI } from "@ai-sdk/openai";
import { Agent } from "@mastra/core/agent";

// Configure provider
const orqProvider = createOpenAI({
  apiKey: process.env.ORQ_API_KEY!,
  baseURL: "https://api.orq.ai/v2/router",
});

// Use Claude
export const claudeAgent = new Agent({
  name: "Claude Assistant",
  model: orqProvider("claude-sonnet-4-5-20250929"),
  instructions: "You are a helpful assistant.",
});

// Use Gemini
export const geminiAgent = new Agent({
  name: "Gemini Assistant",
  model: orqProvider("gemini-2.5-flash"),
  instructions: "You are a helpful assistant.",
});

// Use any other model
export const groqAgent = new Agent({
  name: "Groq Assistant",
  model: orqProvider("llama-3.3-70b-versatile"),
  instructions: "You are a helpful assistant.",
});

// Run with different models
async function main() {
  const result = await claudeAgent.generate("Explain machine learning");
  console.log(result.text);
}

main().catch(console.error);
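Because every model goes through the same provider handle, model choice can be a plain lookup. A minimal sketch, where the task names and the fallback are illustrative and the model ids assume they are enabled in your Orq.ai workspace:

```typescript
// Map task types to model ids, all routed through the same Orq.ai provider.
const MODEL_BY_TASK: Record<string, string> = {
  reasoning: "claude-sonnet-4-5-20250929",
  fast: "gemini-2.5-flash",
  budget: "llama-3.3-70b-versatile",
};

// Fall back to a default model for unknown task types.
function pickModel(task: string): string {
  return MODEL_BY_TASK[task] ?? "gpt-4o";
}

console.log(pickModel("fast"));    // gemini-2.5-flash
console.log(pickModel("unknown")); // gpt-4o
```

The returned id would then be passed when constructing the agent, e.g. `model: orqProvider(pickModel("fast"))`.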

Observability

Getting Started

Integrate Mastra with Orq.ai’s observability to gain complete insights into pipeline execution, agent performance, workflow orchestration, and system reliability using OpenTelemetry.

Prerequisites

Before you begin, ensure you have:
  • An Orq.ai account and API Key
  • Node.js 18 or higher and TypeScript support
  • Mastra installed in your project
  • API keys for your LLM providers and external services

Install Dependencies

# Core Mastra framework
npm install mastra

# OpenTelemetry packages for Node.js
npm install @opentelemetry/api @opentelemetry/sdk-node @opentelemetry/exporter-trace-otlp-http

# Additional OpenTelemetry instrumentation
npm install @opentelemetry/semantic-conventions @opentelemetry/resources

# LLM providers and tools (choose what you need)
npm install openai @anthropic-ai/sdk axios

Configure Orq.ai

Set up your environment variables to connect to Orq.ai’s OpenTelemetry collector.

Unix/Linux/macOS:
export OTEL_EXPORTER_OTLP_ENDPOINT="https://api.orq.ai/v2/otel"
export OTEL_EXPORTER_OTLP_HEADERS="Authorization=Bearer <ORQ_API_KEY>"
export OTEL_RESOURCE_ATTRIBUTES="service.name=mastra-app,service.version=1.0.0"
export OPENAI_API_KEY="<YOUR_OPENAI_API_KEY>"
Windows (PowerShell):
$env:OTEL_EXPORTER_OTLP_ENDPOINT = "https://api.orq.ai/v2/otel"
$env:OTEL_EXPORTER_OTLP_HEADERS = "Authorization=Bearer <ORQ_API_KEY>"
$env:OTEL_RESOURCE_ATTRIBUTES = "service.name=mastra-app,service.version=1.0.0"
$env:OPENAI_API_KEY = "<YOUR_OPENAI_API_KEY>"
Using .env file:
OTEL_EXPORTER_OTLP_ENDPOINT=https://api.orq.ai/v2/otel
OTEL_EXPORTER_OTLP_HEADERS=Authorization=Bearer <ORQ_API_KEY>
OTEL_RESOURCE_ATTRIBUTES=service.name=mastra-app,service.version=1.0.0
OPENAI_API_KEY=<YOUR_OPENAI_API_KEY>
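The headers variable follows the OTLP convention of comma-separated `key=value` pairs. As a sanity check, this is roughly how an exporter splits it — a simplified sketch, not the actual SDK code:

```typescript
// Simplified parser for the OTEL_EXPORTER_OTLP_HEADERS format:
// comma-separated key=value pairs, e.g. "Authorization=Bearer my-key".
function parseOtlpHeaders(raw: string): Record<string, string> {
  const headers: Record<string, string> = {};
  for (const pair of raw.split(",")) {
    const idx = pair.indexOf("=");
    if (idx > 0) {
      // Everything after the first "=" belongs to the value.
      headers[pair.slice(0, idx).trim()] = pair.slice(idx + 1).trim();
    }
  }
  return headers;
}

const parsed = parseOtlpHeaders("Authorization=Bearer my-key");
console.log(parsed["Authorization"]); // Bearer my-key
```

Note that the value itself may contain spaces (as in `Bearer <key>`), which is why only the first `=` separates key from value.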

Default Export

In your Mastra server configuration, enable trace export. The endpoint and headers are picked up from the environment variables set above.
TypeScript
export const mastra = new Mastra({
  // ... other config
  telemetry: {
    serviceName: "my-app",
    enabled: true,
    export: {
      type: "otlp",
      // endpoint and headers will be picked up from env vars
    },
  },
});

Custom Instrumentation

Mastra supports native OpenTelemetry integration for comprehensive observability. Create an instrumentation.mjs file in your Mastra project:
JavaScript
import { NodeSDK } from '@opentelemetry/sdk-node';
import { getNodeAutoInstrumentations } from '@opentelemetry/auto-instrumentations-node';
import { OTLPTraceExporter } from '@opentelemetry/exporter-trace-otlp-http';

const sdk = new NodeSDK({
  traceExporter: new OTLPTraceExporter({
    url: 'https://api.orq.ai/v2/otel/v1/traces',
    headers: {
      Authorization: "Bearer <ORQ_API_KEY>"
    }
  }),
  instrumentations: [getNodeAutoInstrumentations()],
});

sdk.start();

Enable Telemetry in your Mastra initialization:
TypeScript
export const mastra = new Mastra({
  telemetry: {
    enabled: true,
  },
});

All Mastra pipelines and agent calls will be instrumented and exported to Orq.ai through the OTLP exporter.

View Traces

View your traces in the Traces tab of your AI Studio, which also shows real-time analytics for each run.