Orq.ai as Prompt Manager

Orq.ai can be used purely for configuration management, so that you keep full control over your LLM calls.

Configuration Management

You can decide to use orq.ai only as configuration management for your various LLM backends.

This has some advantages over using the AI Gateway:

  • You manage the calls to LLM models end-to-end. This lets you keep control over the integration and manage its lifecycle, ensuring data stays within your infrastructure before reaching the LLM backends.
  • You still benefit from configuration management on the orq.ai side and can fetch the latest configuration from your Deployment at runtime.
  • You still benefit from Deployment Routing, ensuring your users reach the model you intend, based on dynamic Context Attributes.

Using get_config

Our API and SDKs offer not only a way to invoke a Deployment but also a way to fetch its configuration: get_config

📘

To learn more about get_config, see its API Reference.

By using this method, you benefit from Deployment Routing as well as all the configuration stored within the Deployment. You can then use the returned object to call any LLM provider directly from your application.


Example Call

The following is an example call using our SDKs. It is similar to the invoke call, except that it returns a configuration object and doesn't execute any calls to LLM providers.

Python

config = client.deployments.get_config(
    key="Deployment-configuration",
    context={"environments": ["production"], "locale": ["en"]},
    inputs={"country": "Netherlands"},
    metadata={"custom-field-name": "custom-metadata-value"},
)

print(config.to_dict())

Node.js

const deploymentConfig = await client.deployments.getConfig({
  key: "Deployment-configuration",
  context: { environments: ["production"], locale: ["en"] },
  inputs: { country: "Netherlands" },
  metadata: { "custom-field-name": "custom-metadata-value" },
});

key: The key of the Deployment whose configuration to fetch.

inputs: The key-value pairs of variables to replace in your prompts. Default values are used if not provided.

context: The key-value pairs that match your data model and the fields declared in your Variant Routing Configuration matrix.

metadata: The key-value pairs that you want to attach to the log generated by this request.

🚧

When using get_config, you won't benefit from retries and fallbacks, as your calls to LLM providers are made outside of our platform.
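As an illustration, here is a minimal sketch of making that provider call yourself with the OpenAI Python client. It assumes the dictionary returned by to_dict() exposes model and messages keys; the exact shape of the returned object is documented in the get_config API Reference, so verify the keys before relying on them.

from openai import OpenAI

# A minimal sketch, not the definitive integration: the request to the
# LLM provider is made here, from your own infrastructure, not via orq.ai.
openai_client = OpenAI()  # reads OPENAI_API_KEY from the environment

cfg = config.to_dict()  # `config` comes from the get_config example above

# Assumed keys: "model" and "messages"; check the API Reference.
completion = openai_client.chat.completions.create(
    model=cfg["model"],
    messages=cfg["messages"],
)

print(completion.choices[0].message.content)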


Metrics & Logging

By using the id returned from your get_config call, you can also add logs back to your Deployment and benefit from the Dashboard.

To do so, use the add_metrics call (see Add metrics). add_metrics lets you track various metrics, including but not limited to chain ID, conversation ID, user ID, feedback (scores), custom metadata, and performance-related statistics.
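As a sketch, reporting metrics for the call above could look like the following in Python. The add_metrics name comes from this page, but the exact method path and accepted fields are defined in the Add metrics API Reference, so treat the field names below as illustrative.

# Attach metrics to the log created by the get_config call above.
# Field names are illustrative; see the Add metrics API Reference.
client.deployments.add_metrics(
    id=config.id,  # the id returned by get_config
    feedback={"score": 85},  # e.g. a user feedback score
    metadata={"conversation_id": "conv-42", "user_id": "user-123"},
)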