The AI Router is where you can find all the LLM models available for your work. orq.ai packages a number of LLMs that you can search and enable, and you can also onboard private models into the platform.
Only workspace admins have access to the AI Router page and can enable or disable models as needed.
AI Router Overview

Benefits of Using the AI Router

Use the latest models available

Using the AI Router lets you benefit from model releases across providers, as orq.ai takes care of integrating them. When new models become available, you can find them in the AI Router and enable them for use; they are immediately functional.

Unified API for all your models

Once a new model has been enabled, it can be used immediately within a Deployment. Because your application is integrated with orq.ai, it doesn't need any updates: our APIs stay the same even after you change models. orq.ai ensures the integration and readiness of every model available in the AI Router, letting you focus on your application.
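To illustrate the idea of a unified API, here is a minimal sketch in Python. This is not the actual orq.ai SDK; the function name, model identifiers, and payload shape are hypothetical, chosen to show that switching models is a one-field change while the rest of the request stays identical.

```python
# Hypothetical sketch of a unified request payload: the structure stays the
# same regardless of which AI Router model is enabled; only the model id
# changes. Names and model strings here are illustrative, not orq.ai's API.

def build_completion_request(model: str, prompt: str) -> dict:
    """Return a provider-agnostic chat-completion payload."""
    return {
        "model": model,  # the only field that changes when you swap models
        "messages": [{"role": "user", "content": prompt}],
    }

# Switching models is a one-line change; the payload shape is untouched,
# so application code that builds and sends requests needs no update.
req_a = build_completion_request("openai/gpt-4o", "Summarize this ticket.")
req_b = build_completion_request("anthropic/claude-3-5-sonnet", "Summarize this ticket.")

assert req_a["messages"] == req_b["messages"]  # same shape, different model
```

The design point is that model choice becomes configuration rather than code: enabling a new model in the AI Router changes the value of one field, not the integration.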

Understanding the AI Router

Each model includes a detailed overview of its capabilities, performance, and configuration.
You can easily compare models across multiple columns, such as:
  • Provider and Model Type: to identify the source and intended use of the model
  • Modality: to see whether the model supports text, image, or both for input and output
  • Intelligence and Speed Ratings: to quickly assess performance tradeoffs
  • Token and Pricing Data: to understand input and output costs
  • Region and Release Date: to identify where and when models are available
These columns help you make informed decisions when selecting a model for your experiments or deployments. You can also access AI Router code snippets directly from each model card, letting you copy ready-to-use integration examples for immediate use in your application or deployment.