Model Garden

Orq.ai's Model Garden enables you to seamlessly run your AI use cases across different LLM Providers and their supported models.

The model garden is a repository of pre-trained and fine-tuned large language models (LLMs) that product teams use to build applications. It provides a convenient way for teams to access the power of LLMs without having to train, host, monitor, and operate models themselves.

Orq.ai's model garden includes a variety of LLMs, each with its own strengths and weaknesses, so product teams can choose the right LLM for their specific needs. Orq.ai relies on inference providers and does not host models itself.

Supported LLM Providers

Orq.ai supports a wide range of LLM providers and models, so developers can pick the ones best suited to their prompt engineering and LLMOps workflows.

Orq.ai is LLM-agnostic: it doesn't restrict you to particular public, private, or custom LLM providers and models. Within the Model Garden, you can enable the providers and specific models your product works with.

For a detailed overview of all supported LLM providers and models in orq.ai's model garden, see here.

Activating a Model

Activating a model is straightforward: go to the Model Garden and toggle on the models your teams want to use across your workspace.

AI Gateway support

The AI Gateway is an abstraction layer through which orq.ai integrates with different LLMs. All models supported by the AI Gateway are listed in the Model Garden; when you select one of these models, you can create Deployments that call the model directly, without additional API calls.

The models supported in the AI Gateway can be used as primary and fallback models in Deployments.

Orq.ai handles all integrations with the LLM providers and keeps them up to date as providers release new capabilities.
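To make the Deployment flow concrete, the sketch below shows roughly what invoking a Deployment through the AI Gateway from application code could look like. It is a minimal illustration only: the base URL, endpoint path, header, payload keys, and deployment name are assumptions made for this example, not the documented orq.ai API; consult the API reference or SDKs for the actual calls.

```python
import os
import requests

# Illustrative sketch only: the endpoint path, headers, and payload keys below
# are assumptions for the sake of example, not the documented orq.ai API.
API_KEY = os.environ["ORQ_API_KEY"]                # hypothetical environment variable
BASE_URL = "https://api.example-orq-gateway.dev"   # placeholder base URL


def invoke_deployment(deployment_key: str, inputs: dict) -> dict:
    """Call a Deployment that routes through the AI Gateway.

    The gateway resolves which model to call (including any configured
    fallback model) on the server side, so the client only needs the
    deployment key and its input variables.
    """
    response = requests.post(
        f"{BASE_URL}/v2/deployments/invoke",        # hypothetical route
        headers={"Authorization": f"Bearer {API_KEY}"},
        json={"key": deployment_key, "inputs": inputs},
        timeout=30,
    )
    response.raise_for_status()
    return response.json()


if __name__ == "__main__":
    result = invoke_deployment(
        deployment_key="customer-support-summary",  # hypothetical deployment
        inputs={"ticket_text": "My invoice total looks wrong."},
    )
    print(result)
```

Because the primary and fallback models are configured on the Deployment itself, swapping models in the Model Garden requires no change to client code like the above.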

Benefits of the Model Garden

  • Convenience: The Model Garden provides a convenient way for developers to access the power of LLMs without having to train models from scratch.
  • Choice: It includes a variety of LLMs, each with its own strengths and weaknesses, allowing developers to choose the right LLM for their specific needs.