This page describes features extending the AI Gateway, which provides a unified API for accessing multiple AI providers. To learn more, see AI Gateway.
Knowledge Bases provide relevant, use-case-specific information for an LLM to draw on during generation.
Prerequisite
To get started, see Creating a Knowledge Base. Knowledge Bases need to be enriched with source documents and configured to expose chunks that fit your use case. The name of the Knowledge Base is used as the knowledge_id in the model generation.
Quick Start
Using the created Knowledge Base and its id, include the knowledge_bases payload within your model generation call.
The knowledge_bases payload contains the query configuration and search type; to learn more, see Retrieval Settings and Chunking Strategy.
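As a rough sketch, a generation call with a linked Knowledge Base could look like the following. It assumes an OpenAI-compatible chat completions endpoint on the gateway; the endpoint URL, the model name, and the retrieval fields (top_k, search_type) are illustrative placeholders, not the exact schema, so check the API reference for the authoritative request shape.

```python
# Minimal sketch (not the official SDK): POST a chat completion request to the
# gateway and attach a knowledge_bases payload referencing the Knowledge Base
# by its name (knowledge_id). Field names below are assumptions for illustration.
import os
import requests

GATEWAY_URL = "https://api.example-gateway.ai/v2/chat/completions"  # hypothetical endpoint

payload = {
    "model": "openai/gpt-4o",
    "messages": [
        {"role": "user", "content": "What is our refund policy?"}
    ],
    # Links the generation to a Knowledge Base and configures retrieval.
    "knowledge_bases": [
        {
            "knowledge_id": "support-docs",  # name of the Knowledge Base
            "top_k": 5,                      # assumed retrieval setting
            "search_type": "hybrid_search",  # assumed search type
        }
    ],
}

response = requests.post(
    GATEWAY_URL,
    headers={"Authorization": f"Bearer {os.environ['ORQ_API_KEY']}"},
    json=payload,
    timeout=30,
)
response.raise_for_status()
print(response.json()["choices"][0]["message"]["content"])
```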
Orq automatically queries the Knowledge Base and enriches the model generation with the retrieved context.
Within the Traces, you can see the knowledge retrieval step Orq.ai automatically injects into your call to enrich the model context with the linked Knowledge Base.
