This page describes features extending the AI Gateway, which provides a unified API for accessing multiple AI providers. To learn more, see AI Gateway.
Knowledge Bases provide relevant, domain-specific information for an LLM to use at generation time.

Prerequisite

To get started, see Creating a Knowledge Base. A Knowledge Base needs to be enriched with source documents and configured to expose chunks that fit your use case.
The name of the Knowledge Base is used as the knowledge_id in the model generation call.

Quick Start

Using the created Knowledge Base and its id, include the knowledge_bases payload in your model generation call.
The knowledge_bases payload contains the query configuration and search type; to learn more, see Retrieval Settings and Chunking Strategy.
curl -X POST https://api.orq.ai/v2/proxy/chat/completions \
  -H "Authorization: Bearer $ORQ_API_KEY" \
  -H "Content-Type: application/json" \
  -d '{
  "model": "openai/gpt-4o",
  "messages": [
    {
      "role": "user",
      "content": "How can I upgrade my account?"
    }
  ],
  "orq": {
    "knowledge_bases": [
      {
        "knowledge_id": "api-documentation",
        "top_k": 5,
        "threshold": 0.7,
        "search_type": "hybrid_search"
      }
    ]
  }
}' 
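The same request can be sketched in Python using only the standard library. This is a minimal sketch, not an official SDK: the endpoint, payload, and headers mirror the curl example above, and the `ask` helper is a hypothetical name introduced here for illustration.

```python
import json
import urllib.request

# Payload mirroring the curl example above. The "orq.knowledge_bases"
# entry tells the gateway which Knowledge Base to query and how.
payload = {
    "model": "openai/gpt-4o",
    "messages": [
        {"role": "user", "content": "How can I upgrade my account?"}
    ],
    "orq": {
        "knowledge_bases": [
            {
                "knowledge_id": "api-documentation",
                "top_k": 5,
                "threshold": 0.7,
                "search_type": "hybrid_search",
            }
        ]
    },
}

def ask(api_key: str) -> dict:
    """Send the chat completion request through the Orq proxy."""
    req = urllib.request.Request(
        "https://api.orq.ai/v2/proxy/chat/completions",
        data=json.dumps(payload).encode("utf-8"),
        headers={
            "Authorization": f"Bearer {api_key}",
            "Content-Type": "application/json",
        },
        method="POST",
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)

# Usage (requires a valid key):
#   result = ask(os.environ["ORQ_API_KEY"])
```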
Orq automatically queries the knowledge base and enriches the model generation with the retrieved context.

Within the Traces, you can see the knowledge retrieval step Orq.ai automatically injects into your call to enrich the model context with the linked Knowledge Base.