Set up your LiteLLM Instance
- Open the **AI Router** page.
- Find the **Providers** tab and select **LiteLLM**.

The LiteLLM provider within the Providers tab.
- Choose **Setup LiteLLM instance**.

- Enter the **Base URL** and **API Key** for your instance.
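Once the Base URL and API Key are saved, you can sanity-check the connection directly. This is a minimal sketch assuming your LiteLLM instance exposes the standard OpenAI-compatible `/v1/models` endpoint (the LiteLLM proxy does by default); the URL and key shown are placeholders, not values from this guide.

```python
import json
import urllib.request

def auth_headers(api_key: str) -> dict:
    # LiteLLM accepts a standard Bearer token, like the OpenAI API.
    return {"Authorization": f"Bearer {api_key}"}

def list_models(base_url: str, api_key: str) -> list[str]:
    # GET {base_url}/v1/models returns {"data": [{"id": ...}, ...]}
    req = urllib.request.Request(
        f"{base_url.rstrip('/')}/v1/models",
        headers=auth_headers(api_key),
    )
    with urllib.request.urlopen(req) as resp:
        body = json.load(resp)
    return [m["id"] for m in body.get("data", [])]

if __name__ == "__main__":
    # Placeholder values -- substitute your own instance details.
    print(list_models("http://localhost:4000", "sk-your-key"))
```

If this call returns your model list, the same Base URL and API Key should work in the AI Router's setup form.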
Import Models
- Switch to the **Models** page in the AI Router.
- Select **Add Models** and choose **Import from LiteLLM**.

Adding models using **Import from LiteLLM**.
- Select the models you want from the list imported from your LiteLLM provider.

- Select **Import** to bring your LiteLLM models into the AI Router.
Model Imported
Your model is now usable within the AI Router.
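With the model imported, requests can be routed through it. The sketch below assumes the AI Router exposes an OpenAI-compatible chat-completions endpoint, which is a common pattern but not confirmed by this guide; the base URL, key, and model name are all placeholders.

```python
import json
import urllib.request

def chat_payload(model: str, prompt: str) -> dict:
    # Minimal OpenAI-style chat-completion request body.
    return {"model": model, "messages": [{"role": "user", "content": prompt}]}

def chat(base_url: str, api_key: str, model: str, prompt: str) -> str:
    # POST {base_url}/v1/chat/completions with a Bearer token.
    req = urllib.request.Request(
        f"{base_url.rstrip('/')}/v1/chat/completions",
        data=json.dumps(chat_payload(model, prompt)).encode(),
        headers={
            "Authorization": f"Bearer {api_key}",
            "Content-Type": "application/json",
        },
        method="POST",
    )
    with urllib.request.urlopen(req) as resp:
        body = json.load(resp)
    return body["choices"][0]["message"]["content"]

if __name__ == "__main__":
    # Placeholder values -- substitute your router's details
    # and the model name you imported above.
    print(chat("http://localhost:8080", "sk-your-key", "gpt-4o", "Hello"))
```

The model name passed in the payload should match the identifier shown on the Models page after import.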