Logs are essential for monitoring your models in all environments within orq.ai. They give you a clear overview of all your LLM calls, showing whether each one succeeded or failed. When a call fails, you can dive into the problematic generation, understand the context and parameters used, and reproduce and fix the issue.

Finding Logs

Every LLM interaction generates a log you can look back at. This is true for Playground, Experiment, Deployment, and Agent. You can find the logs for each of these sections in its Logs tab.

Inside the Logs

Overview

The logs overview shows a list of all logs, ordered by recency, that you can dive into. For each log you can see its precise timestamp along with key details: status, provider, model, latency and cost.
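To make these fields concrete, here is a minimal sketch in Python of sorting and filtering log entries the way the overview does. The field names (`status`, `provider`, `latency_ms`, `cost`) are illustrative assumptions, not the actual orq.ai log schema:

```python
# Illustrative only: these field names are assumptions,
# not the documented orq.ai log schema.
logs = [
    {"timestamp": "2024-05-01T12:00:00Z", "status": "success",
     "provider": "openai", "model": "gpt-4o", "latency_ms": 820, "cost": 0.0031},
    {"timestamp": "2024-05-01T12:01:10Z", "status": "failed",
     "provider": "anthropic", "model": "claude-3-haiku", "latency_ms": 150, "cost": 0.0},
]

# Order by recency (newest first), as the overview does.
logs.sort(key=lambda log: log["timestamp"], reverse=True)

# Narrow down to the failing calls worth diving into.
failed = [log for log in logs if log["status"] == "failed"]
print(len(failed))
```

This is the same triage workflow the overview supports in the UI: scan by recency, then zoom in on failures.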

Request

To learn more about the technical request that was made, select the Request panel on the right of the log detail. Here you can see the full model configuration as well as the execution latency and cost of the generation.
By opening the Data panel, you can see the Variables and Context Attributes used for the call.

Feedback

You can provide feedback on the quality and accuracy of the generations made within the log.
To learn more, see Feedback.

Hyperlinking

When you want to copy the exact same LLM configuration (prompt, model, etc.) to another module within Orq, you can “hyperlink” it using the hyperlink buttons in the log detail.

Debug

Within the Debug tab, you can view the log details as a JSON object and copy its payload.
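Because the Debug tab exposes the log as JSON, you can paste the copied payload into a small script for offline inspection. A minimal sketch, with hypothetical keys (the real payload shape may differ):

```python
import json

# Paste the payload copied from the Debug tab between the triple quotes.
# The keys below are hypothetical examples, not a documented log schema.
raw = """
{
  "status": "failed",
  "model": "gpt-4o",
  "error": {"message": "rate limit exceeded"}
}
"""

payload = json.loads(raw)

# Surface error details for failed generations so you can reproduce the issue.
error_message = None
if payload.get("status") == "failed":
    error_message = payload.get("error", {}).get("message", "unknown error")
print(error_message)
```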

Limitations

Logs can contain a maximum of 16 MB of data. This includes all text inputs, generations, retrievals and embedded images. Logs that exceed this limit are discarded from the Logs view.
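If your application attaches large inputs (long documents, base64-embedded images), you may want to check the payload size client-side so a log is not silently discarded. A minimal sketch of such a guard; the 16 MB limit comes from this page, everything else is illustrative:

```python
import json

MAX_LOG_BYTES = 16 * 1024 * 1024  # 16 MB cap on a single log


def fits_log_limit(payload: dict) -> bool:
    """Return True if the serialized payload stays under the log size cap."""
    size = len(json.dumps(payload).encode("utf-8"))
    return size <= MAX_LOG_BYTES


small = {"input": "short prompt", "output": "short answer"}
huge = {"input": "x" * (17 * 1024 * 1024)}  # oversized text input

print(fits_log_limit(small))
print(fits_log_limit(huge))
```

When a payload fails this check, truncating the largest field before sending keeps the log visible instead of losing it entirely.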