SmolAgents Integration | AI Router & Observability
Connect SmolAgents to Orq.ai’s AI Router for access to 300+ LLMs, and send traces via OpenTelemetry for complete observability.
AI Router
Route your LLM calls through the AI Router with a single base URL change. Zero vendor lock-in: always run on the best model at the lowest cost for your use case.
Observability
Instrument your code with OpenTelemetry to capture traces, logs, and metrics for every LLM call, agent step, and tool use.
SmolAgents is a lightweight Python agent framework by Hugging Face. Using OpenAIServerModel with a custom api_base, you can route all LLM calls through Orq.ai’s AI Router for access to 300+ models, cost tracking, and reliability features.
```python
import os

from smolagents import CodeAgent, OpenAIServerModel

model = OpenAIServerModel(
    model_id="openai/gpt-4o-mini",
    api_base="https://api.orq.ai/v2/router",
    api_key=os.environ["ORQ_API_KEY"],
)

agent = CodeAgent(tools=[], model=model)
agent.run("What is the capital of France?")
```
SmolagentsInstrumentor().instrument() must be called before any agent is instantiated. Once instrumented, all CodeAgent and ToolCallingAgent runs are automatically traced.
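A minimal setup sketch using the OpenInference SmolAgents instrumentor with an OTLP HTTP exporter. The endpoint URL and header format below are assumptions for illustration; check the Orq.ai documentation for the actual OTLP ingest endpoint and authentication scheme.

```python
import os

from opentelemetry import trace
from opentelemetry.sdk.trace import TracerProvider
from opentelemetry.sdk.trace.export import BatchSpanProcessor
from opentelemetry.exporter.otlp.proto.http.trace_exporter import OTLPSpanExporter
from openinference.instrumentation.smolagents import SmolagentsInstrumentor

# Hypothetical endpoint and auth header — verify against Orq.ai's docs.
exporter = OTLPSpanExporter(
    endpoint="https://api.orq.ai/v2/otel/v1/traces",
    headers={"Authorization": f"Bearer {os.environ['ORQ_API_KEY']}"},
)

# Register a tracer provider that batches and exports spans.
provider = TracerProvider()
provider.add_span_processor(BatchSpanProcessor(exporter))
trace.set_tracer_provider(provider)

# Instrument BEFORE creating any CodeAgent or ToolCallingAgent,
# so every subsequent agent run is traced automatically.
SmolagentsInstrumentor().instrument(tracer_provider=provider)
```

After this runs, agents created as in the router example above need no further changes; each LLM call, agent step, and tool invocation is emitted as a span.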