The AI Router exposes endpoints that are fully compatible with the OpenAI API, letting you use every model available in your Orq.ai AI Router without changing your application logic. Keep your existing OpenAI client, point it at the Orq.ai proxy base URL, and continue as usual.
Documentation Index
Fetch the complete documentation index at: https://docs.orq.ai/llms.txt
Use this file to discover all available pages before exploring further.
For the OpenAI API specification, see the API Reference
Drop-in Integration (No Code Changes)
- Keep your existing OpenAI SDK or HTTP integration.
- Set the base URL to https://api.orq.ai/v3/router.
- Use your Orq.ai API key in the Authorization header.
- Call the same endpoints and payloads you already use with OpenAI.
Base URL
OpenAI-compatible endpoint: https://api.orq.ai/v3/router
Authentication
Authenticate with your Orq.ai API key via the Authorization: Bearer $ORQ_API_KEY header.
To learn more about Orq.ai API keys, see API Key.
Authorization: Bearer $ORQ_API_KEY
Content-Type: application/json
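As a minimal sketch, the same two headers can be used with a plain stdlib HTTP request, here building (but not sending) a chat completion call; the model ID is a placeholder.

```python
import json
import os
import urllib.request

headers = {
    "Authorization": f"Bearer {os.environ.get('ORQ_API_KEY', '<your-orq-api-key>')}",
    "Content-Type": "application/json",
}
body = json.dumps({
    "model": "openai/gpt-4o-mini",  # placeholder model ID
    "messages": [{"role": "user", "content": "Hello!"}],
}).encode()

# Build the POST /chat/completions request against the router base URL.
request = urllib.request.Request(
    "https://api.orq.ai/v3/router/chat/completions",
    data=body,
    headers=headers,
    method="POST",
)
# To send it:
# with urllib.request.urlopen(request) as resp:
#     print(json.load(resp)["choices"][0]["message"]["content"])
```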
Supported Endpoints
Schema, parameters, and response formats match the OpenAI API.
- (WIP) GET /models - List available models
- (WIP) GET /models/{model} - Get model details
- POST /chat/completions - Chat completions (supports streaming, images, files, and tool calls)
- POST /completions - Text completions
- POST /embeddings - Vector embeddings
- POST /images/generations - Image generation
- POST /images/edits - Image editing
- POST /images/variations - Image variations
- POST /moderations - Text moderation
- POST /rerank - Rerank results
- POST /speech - Text-to-speech
- POST /audio/transcriptions - Transcribe audio into the input language
- POST /audio/translations - Translate audio into English
- POST /responses - Create a model response with built-in tools (Web Search)
Models
Use the model field exactly as you would with OpenAI, substituting the ID of any model available in the AI Router Settings.
Error Handling & Compatibility Notes
- HTTP status codes and error structures follow OpenAI’s conventions.
- Streaming, function/tool calls, and multimodal inputs (images/files) are supported on /chat/completions.
- The AI Gateway is versioned under /v3/router. Ensure clients target this path.