OpenAI-Compatible API

📖

This page describes features extending the AI Proxy, which provides a unified API for accessing multiple AI providers. To learn more, see AI Proxy.

The Orq.ai proxy exposes endpoints that are fully compatible with the OpenAI API, letting you use every model available in your Orq.ai Model Garden without changing your application logic. Keep your existing OpenAI client, point it to the Orq.ai proxy baseURL, and continue as usual.

📘

For the OpenAI API specification, see the API Reference.

👍

Drop-in Integration (No Code Changes)

  1. Keep your existing OpenAI SDK or HTTP integration.
  2. Set the base URL to https://api.orq.ai/v2/proxy.
  3. Use your Orq.ai API Key in the Authorization header.
  4. Call the same endpoints and payloads you already use with OpenAI.
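The steps above can be sketched with nothing but the standard library: the request below is the same payload an OpenAI client would send, only the host differs. The model ID and the `ORQ_API_KEY` environment variable are assumptions for illustration.

```python
import json
import os
import urllib.request

BASE_URL = "https://api.orq.ai/v2/proxy"
API_KEY = os.environ.get("ORQ_API_KEY", "sk-placeholder")  # your Orq.ai API key

def build_chat_request(model: str, messages: list) -> urllib.request.Request:
    """Build (but do not send) a /chat/completions request for the proxy."""
    body = json.dumps({"model": model, "messages": messages}).encode("utf-8")
    return urllib.request.Request(
        url=f"{BASE_URL}/chat/completions",
        data=body,
        headers={
            "Authorization": f"Bearer {API_KEY}",
            "Content-Type": "application/json",
        },
        method="POST",
    )

# "example/model-id" is a placeholder; use any model ID from your Model Garden.
req = build_chat_request("example/model-id", [{"role": "user", "content": "Hello"}])
# To send: resp = urllib.request.urlopen(req); print(json.load(resp))
```

With the official OpenAI SDK, the equivalent is simply constructing the client with `base_url="https://api.orq.ai/v2/proxy"` and your Orq.ai key as `api_key` — no other code changes.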

Base URL

OpenAI-compatible proxy endpoint:

https://api.orq.ai/v2/proxy

All routes below are relative to this base URL and mirror OpenAI’s request/response formats.
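In other words, each OpenAI-style path is appended directly to the base URL (a quick sketch):

```python
BASE_URL = "https://api.orq.ai/v2/proxy"

def proxy_url(endpoint: str) -> str:
    # Each OpenAI-style path is appended directly to the base URL.
    return f"{BASE_URL}{endpoint}"

print(proxy_url("/chat/completions"))
# https://api.orq.ai/v2/proxy/chat/completions
```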

Authentication

Authenticate with your Orq.ai API key via the Authorization: Bearer $ORQ_API_KEY header.

📘

To learn more about Orq.ai API Keys, see API Key.

Minimum headers:

  • Authorization: Bearer $ORQ_API_KEY
  • Content-Type: application/json

Supported Endpoints

Schema, parameters, and response formats match the OpenAI API.

  • (WIP) GET /models - List available models
  • (WIP) GET /models/{model} - Get model details
  • POST /chat/completions - Chat completions (supports streaming, images, files, and tool calls)
  • POST /completions - Text completions
  • POST /embeddings - Vector embeddings
  • POST /images/generations - Image generation
  • POST /images/edits - Image editing
  • POST /images/variations - Image variations
  • POST /moderations - Text moderation
  • POST /rerank - Rerank results
  • POST /speech - Text-to-speech
  • POST /audio/transcriptions - Transcribe audio into the input language
  • POST /audio/translations - Translate audio into English
  • POST /responses - Create a model response with built-in tools

Models

Use the model field exactly as you would with OpenAI, substituting the ID of any model available in your Orq.ai Model Garden.

📘

To list your models via the API, see Using the Model Garden via the API.
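Switching providers is then just a change of the model string (the model IDs below are hypothetical placeholders; list real IDs via the Model Garden API):

```python
def chat_payload(model: str, prompt: str) -> dict:
    # The payload shape is identical for every provider; only `model` changes.
    return {"model": model, "messages": [{"role": "user", "content": prompt}]}

# Hypothetical model IDs -- substitute IDs from your own Model Garden.
payloads = [chat_payload(m, "Hello") for m in ("provider-a/model-x", "provider-b/model-y")]
```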

Error Handling & Compatibility Notes

  • HTTP status codes and error structures follow OpenAI’s conventions.
  • Streaming, function/tool calls, and multimodal inputs (images/files) are supported on /chat/completions.
  • The proxy is versioned under /v2/proxy. Ensure clients target this path.
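Since error structures follow OpenAI's conventions, existing error handling carries over unchanged; a sketch of decoding the standard `{"error": {...}}` envelope (the example body is illustrative):

```python
import json

def extract_error(status: int, body: str):
    """Pull the message out of an OpenAI-style error envelope, if present."""
    if status < 400:
        return None
    try:
        return json.loads(body).get("error", {}).get("message", "unknown error")
    except json.JSONDecodeError:
        return body  # fall back to the raw body if it is not JSON

# Illustrative error body following OpenAI's error structure.
msg = extract_error(
    401,
    '{"error": {"message": "Invalid API key", "type": "invalid_request_error"}}',
)
```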