Use the following steps to add your LiteLLM integration to the AI Studio and import your existing models into the AI Router.

Set up your LiteLLM Instance

  • Open the AI Router page within the AI Studio.
  • Find the Providers tab and select LiteLLM.
  • Choose Setup LiteLLM instance, then enter the Base URL and API Key for your instance.
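Before saving, you can sanity-check the Base URL and API Key by querying your LiteLLM instance directly. The sketch below assumes the standard OpenAI-compatible `/v1/models` endpoint that LiteLLM proxies expose; the URL and key shown are placeholders for your own values.

```python
import json
import urllib.request

def list_models(base_url: str, api_key: str) -> list[str]:
    """Return the model IDs a LiteLLM proxy exposes via its
    OpenAI-compatible /v1/models endpoint."""
    url = base_url.rstrip("/") + "/v1/models"
    req = urllib.request.Request(
        url, headers={"Authorization": f"Bearer {api_key}"}
    )
    with urllib.request.urlopen(req) as resp:
        payload = json.load(resp)
    # The endpoint returns {"data": [{"id": "..."}, ...]}
    return [model["id"] for model in payload.get("data", [])]

# Example (placeholder values -- use the same Base URL and API Key
# you enter in the AI Studio):
# list_models("http://localhost:4000", "sk-your-litellm-key")
```

If the call succeeds, the IDs it returns are the models you should see in the import step below.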

Import Models

  • Switch to the Models tab in the AI Router.
  • Select Add Models and choose Import from LiteLLM.
  • Select the models you want from the list imported from your LiteLLM provider, then select Import to bring them into the AI Router.
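The models that appear in the import list are the entries defined in your LiteLLM instance's `model_list` configuration. As a reference point, a minimal LiteLLM `config.yaml` looks like the following; the model names and environment variable are placeholders for your own setup.

```yaml
model_list:
  - model_name: gpt-4o              # name shown in the import list
    litellm_params:
      model: openai/gpt-4o          # provider/model LiteLLM routes to
      api_key: os.environ/OPENAI_API_KEY
```

If a model you expect is missing from the import list, check that it is defined in this file and that the LiteLLM proxy has been restarted since the change.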

Model Imported

Your model is now usable within the AI Studio.