AI Router
Overview
Microsoft Semantic Kernel is an SDK that integrates Large Language Models (LLMs) with conventional programming languages. By connecting Semantic Kernel to Orq.ai’s AI Router, you transform experimental AI agents into production-ready systems with enterprise-grade capabilities.
Key Benefits
Orq.ai’s AI Router enhances your Semantic Kernel applications with:
Complete Observability
Track every agent step, tool use, and interaction with detailed traces and analytics
Built-in Reliability
Automatic fallbacks, retries, and load balancing for production resilience
Cost Optimization
Real-time cost tracking and spend management across all your AI operations
Multi-Provider Access
Access 300+ LLMs from 20+ providers through a single, unified integration
Prerequisites
Before integrating Semantic Kernel with Orq.ai, ensure you have:
- An Orq.ai account and API key
- Python 3.8 or higher
- Semantic Kernel SDK installed
To set up your API key, see API keys & Endpoints.
Installation
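Assuming the current PyPI package names (`semantic-kernel` and `openai`), a typical install looks like:

```shell
pip install semantic-kernel openai
```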
Install both Semantic Kernel and the OpenAI SDK from PyPI.
Configuration
Configure Semantic Kernel to use Orq.ai’s AI Router by creating an OpenAI client with a custom base URL:
base_url: https://api.orq.ai/v2/router
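A sketch of that wiring, assuming the Semantic Kernel Python SDK’s `OpenAIChatCompletion` connector (which accepts a preconfigured `AsyncOpenAI` client via its `async_client` parameter) and an illustrative model id:

```python
import os

from openai import AsyncOpenAI
from semantic_kernel import Kernel
from semantic_kernel.connectors.ai.open_ai import OpenAIChatCompletion

# Point the OpenAI client at Orq.ai's AI Router instead of api.openai.com.
client = AsyncOpenAI(
    api_key=os.environ["ORQ_API_KEY"],  # your Orq.ai API key
    base_url="https://api.orq.ai/v2/router",
)

kernel = Kernel()
kernel.add_service(
    OpenAIChatCompletion(
        service_id="orq",
        ai_model_id="openai/gpt-4o",  # illustrative; use any id from your Orq.ai model catalog
        async_client=client,
    )
)
```

From here, every request Semantic Kernel makes through this service passes through the router, so tracing, fallbacks, and cost tracking apply without further code changes.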
Basic Example
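A minimal end-to-end sketch, assuming the client configuration above, the Python SDK’s `ChatHistory` and `get_chat_message_content` API, and an `ORQ_API_KEY` environment variable:

```python
import asyncio
import os

from openai import AsyncOpenAI
from semantic_kernel.connectors.ai.open_ai import (
    OpenAIChatCompletion,
    OpenAIChatPromptExecutionSettings,
)
from semantic_kernel.contents import ChatHistory


async def main() -> None:
    # Route all OpenAI-compatible traffic through Orq.ai's AI Router.
    client = AsyncOpenAI(
        api_key=os.environ["ORQ_API_KEY"],
        base_url="https://api.orq.ai/v2/router",
    )
    service = OpenAIChatCompletion(
        ai_model_id="openai/gpt-4o",  # illustrative model id
        async_client=client,
    )

    history = ChatHistory()
    history.add_system_message("You are a concise assistant.")
    history.add_user_message("In one sentence, what does an AI router do?")

    reply = await service.get_chat_message_content(
        chat_history=history,
        settings=OpenAIChatPromptExecutionSettings(max_tokens=100),
    )
    print(reply)


asyncio.run(main())
```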
A complete example needs nothing beyond the configured service and a chat completion request.
Using Plugins (Functions)
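A sketch of a native plugin with automatic function calling, assuming a `kernel` and `service` configured as above and run inside an async function; `WeatherPlugin` and its stubbed response are hypothetical:

```python
from semantic_kernel.connectors.ai.function_choice_behavior import (
    FunctionChoiceBehavior,
)
from semantic_kernel.connectors.ai.open_ai import OpenAIChatPromptExecutionSettings
from semantic_kernel.contents import ChatHistory
from semantic_kernel.functions import kernel_function


class WeatherPlugin:
    """Hypothetical plugin; a real one would call a weather API."""

    @kernel_function(description="Get the current weather for a city.")
    def get_weather(self, city: str) -> str:
        return f"It is sunny and 22°C in {city}."


kernel.add_plugin(WeatherPlugin(), plugin_name="weather")

# Let the model decide when to invoke plugin functions.
settings = OpenAIChatPromptExecutionSettings(
    function_choice_behavior=FunctionChoiceBehavior.Auto(),
)

history = ChatHistory()
history.add_user_message("Should I bring an umbrella in Amsterdam today?")

# The kernel is passed so the service can resolve and call plugin functions;
# the tool call and its result both appear in Orq.ai's request traces.
reply = await service.get_chat_message_content(
    chat_history=history,
    settings=settings,
    kernel=kernel,
)
```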
Semantic Kernel’s power comes from combining LLMs with plugins, and they work unchanged when requests are routed through Orq.ai.
Model Selection
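Switching providers is just a different `ai_model_id` on the same router client; the identifiers below are illustrative, so check your Orq.ai model catalog for the exact names:

```python
from semantic_kernel.connectors.ai.open_ai import OpenAIChatCompletion

# One router client, many providers; model ids are illustrative examples.
gpt = OpenAIChatCompletion(
    service_id="gpt",
    ai_model_id="openai/gpt-4o",
    async_client=client,
)
claude = OpenAIChatCompletion(
    service_id="claude",
    ai_model_id="anthropic/claude-3-5-sonnet",
    async_client=client,
)
gemini = OpenAIChatCompletion(
    service_id="gemini",
    ai_model_id="google/gemini-1.5-pro",
    async_client=client,
)

# Register all three; pick one per request via its service_id.
for svc in (gpt, claude, gemini):
    kernel.add_service(svc)
```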
With Orq.ai, you can use any supported model from 20+ providers by changing only the model identifier.
Streaming Responses
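A streaming sketch using the connector’s `get_streaming_chat_message_content` method, again inside an async function, with `service`, `history`, and `settings` assumed from the examples above:

```python
# Chunks arrive incrementally; printing them as they come gives a
# typewriter-style response while the router streams tokens back.
async for chunk in service.get_streaming_chat_message_content(
    chat_history=history,
    settings=settings,
):
    print(str(chunk), end="", flush=True)
print()
```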
Semantic Kernel’s streaming APIs work through Orq.ai as well.
Observability & Monitoring
All Semantic Kernel interactions routed through Orq.ai are automatically tracked and available in the AI Studio:
- Request Traces: View complete conversation flows and function calls
- Plugin Usage: Monitor which plugins are being invoked and their success rates
- Performance Metrics: Track latency, token usage, and completion rates
- Cost Analysis: Understand spending patterns across models and providers