The AI Router app lets you find all available LLM models for your work. orq.ai packages a number of LLMs that you can search and enable, and you can also onboard private models into the platform.

AI Router Homepage

Benefits of Using the AI Router

Use the latest models available

Benefit from every model release across providers, as the AI Router takes care of integrating them. When new models become available, you can find them in the AI Router and enable them for use; they are immediately functional.

Unified API for all your models

Once a new model has been enabled, it can be used immediately with the AI Router. Because your application integrates with our APIs, it needs no updates when you change models: the APIs stay the same. orq.ai ensures the integration and readiness of every model available in the AI Router, letting you focus on your application by using it as an AI Gateway.
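The model-swap behavior described above can be sketched as follows. This is a minimal illustration, not the documented orq.ai API: the payload shape assumes an OpenAI-style chat-completion request, and the model identifiers are placeholders.

```python
import json

def build_request(model: str, prompt: str) -> dict:
    """Build a chat-completion payload (illustrative shape).

    Only the model identifier changes when you switch models;
    the rest of the request, and your application code, stay the same.
    """
    return {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    }

# Switching models is a one-string change (placeholder model ids):
req_a = build_request("provider-a/model-x", "Summarize this ticket.")
req_b = build_request("provider-b/model-y", "Summarize this ticket.")

# Identical structure, different model id.
assert req_a["messages"] == req_b["messages"]
print(json.dumps(req_a, indent=2))
```

Because the request shape is stable across models, the only diff when adopting a newly enabled model is the model identifier string.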

Understanding the Router

Each model includes a detailed overview of its capabilities, performance, and configuration.
You can easily compare models across multiple columns, such as:
  • Provider and Model Type — to identify the source and intended use of the model
  • Modality — to see whether the model supports text, image, or both for input and output
  • Intelligence and Speed Ratings — to quickly assess performance tradeoffs
  • Token and Pricing Data — to understand input and output costs
  • Region and Release Date — to identify where and when models are available
These columns help you make informed decisions when selecting a model for your experiments or deployments. You can also access AI Router code snippets directly from each model card, letting you copy ready-to-use integration examples into your application or deployment.

Find ready-to-use code snippets for invoking your model through the AI Router.