Use the following steps to add your LiteLLM integration to the Orq.ai Studio and import your existing models to the AI Router.

Set up your LiteLLM Instance

  • Open the **AI Router** page.
  • Find the **Providers** tab and select **LiteLLM**.
  • Choose **Setup LiteLLM instance**. (A quick way to verify your LiteLLM proxy is reachable is sketched after this list.)

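Before connecting, it can help to confirm that your LiteLLM proxy is reachable and to see which models it exposes, since those are the models that become available for import. The sketch below assumes a standard LiteLLM proxy deployment that serves the OpenAI-compatible `/v1/models` endpoint; `LITELLM_BASE_URL` and `LITELLM_API_KEY` are placeholders for your own deployment's values.

```python
# Minimal sketch: list the models exposed by your LiteLLM proxy.
# LITELLM_BASE_URL and LITELLM_API_KEY are placeholders for your deployment.
import os
import requests

BASE_URL = os.environ.get("LITELLM_BASE_URL", "http://localhost:4000")
API_KEY = os.environ.get("LITELLM_API_KEY", "")

# The LiteLLM proxy exposes an OpenAI-compatible /v1/models endpoint;
# these are the models the AI Router can import.
resp = requests.get(
    f"{BASE_URL}/v1/models",
    headers={"Authorization": f"Bearer {API_KEY}"},
    timeout=10,
)
resp.raise_for_status()
for model in resp.json().get("data", []):
    print(model["id"])
```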
Import Models

  • Switch to the **Models** page in the AI Router.
  • Select **Add Models** and choose **Import from LiteLLM**.
  • Select the models you want to import from the list retrieved from your LiteLLM provider.

Model Imported

Your model is now usable within the AI Router.
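As a rough usage sketch, the snippet below calls an imported model through an OpenAI-compatible chat completions endpoint. This assumes your AI Router is exposed via such an endpoint; the base URL, API key, and model name (`my-litellm-model`) are placeholders, so substitute the values shown in your Orq.ai workspace.

```python
# Hypothetical usage sketch: calling an imported model through an
# OpenAI-compatible endpoint. Base URL, API key, and model name are
# placeholders; use the values from your own workspace.
import os
from openai import OpenAI

client = OpenAI(
    base_url=os.environ["ORQ_ROUTER_BASE_URL"],  # your AI Router endpoint (placeholder)
    api_key=os.environ["ORQ_API_KEY"],           # your API key (placeholder)
)

response = client.chat.completions.create(
    model="my-litellm-model",  # the model id you imported from LiteLLM (placeholder)
    messages=[{"role": "user", "content": "Hello from the AI Router"}],
)
print(response.choices[0].message.content)
```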