Prerequisites

To start using the Autogen SDK with the AI Router, you need an API key ready in your account.
To set up your API key, see API keys & Endpoints.
To use private models, see Onboarding Private Models.
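The snippets in this guide read the key from the `ORQ_API_KEY` environment variable (a naming convention for this guide, not a requirement of the SDK). A minimal sketch of checking for it before making any calls:

```python
import os

# Read the Orq API key from the environment; the variable name
# ORQ_API_KEY is a convention assumed by the examples in this guide.
api_key = os.getenv("ORQ_API_KEY", "")
if not api_key:
    print("ORQ_API_KEY is not set; create a key under API keys & Endpoints")
```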

Using Orq.ai as AI Router

With the Autogen SDK, point the base URL at the AI Router to route calls through our API without changing any other part of your code. Through the AI Router you get platform traces and cost and usage monitoring, while keeping full compatibility and a unified API across all models.
base_url: https://api.orq.ai/v2/proxy
api_key: Your Orq API key
from autogen import AssistantAgent, UserProxyAgent
import os

# Route every model call through the Orq AI Router by pointing
# base_url at the proxy endpoint.
config_list = [
    {
        "api_key": os.getenv("ORQ_API_KEY"),
        "model": "openai/gpt-3.5-turbo",  # Format the model as `supplier/model`
        "base_url": "https://api.orq.ai/v2/proxy",
        "api_type": "openai",
    }
]

assistant = AssistantAgent("assistant", llm_config={"config_list": config_list})
user_proxy = UserProxyAgent(
    "user_proxy",
    code_execution_config={"work_dir": "coding", "use_docker": False},
)
user_proxy.initiate_chat(assistant, message="Write a Python function that reverses a string.")
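Because every entry points at the same Router endpoint, you can list several `supplier/model` identifiers in `config_list` and let Autogen try them in order, which gives simple fallback behavior. A sketch under that assumption (the model names below are illustrative; use whichever models your Orq workspace exposes):

```python
import os

# Shared Router settings; only the model identifier changes per entry.
base = {
    "api_key": os.getenv("ORQ_API_KEY", "demo-key"),  # placeholder default for illustration
    "base_url": "https://api.orq.ai/v2/proxy",
    "api_type": "openai",
}

# Autogen walks config_list in order, so later entries act as fallbacks.
config_list = [
    {**base, "model": "openai/gpt-4o"},
    {**base, "model": "openai/gpt-3.5-turbo"},
]
```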