Realtime Logs & Events

Logs & Events are key to improving the quality of your models: with realtime access to all generation data, you can continuously improve your environments.

Logs & Events are essential for monitoring your models across all environments within orq.ai.

With logs, you keep a clear overview of all your LLM calls and know when they succeed or fail. When something goes wrong, you can dive into the problematic generation, understand the context and parameters used, and reproduce and fix the issue.

Where to find logs

Every LLM interaction generates a log you can look back at. This is true for Playgrounds, Experiments, and Deployments: you can find the logs in the Logs tab of each respective section.

Inside the Logs

Overview

The logs overview shows a list of all logs, ordered by recency. For each log, you can see the precise timestamp along with key details such as status, provider, model, latency, and cost.
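
For illustration, a single log line in this overview can be thought of as a record along the following lines; the field names are hypothetical, not the exact orq.ai schema.

```python
# Illustrative sketch of the details shown for one log line in the overview.
# Field names and values are hypothetical, not the exact orq.ai schema.
log_entry = {
    "timestamp": "2024-05-14T09:32:17Z",  # precise time of the generation
    "status": "succeeded",                # or "failed" for problematic calls
    "provider": "openai",                 # LLM provider handling the call
    "model": "gpt-4o",                    # model used for the generation
    "latency_ms": 1240,                   # execution latency
    "cost_usd": 0.0031,                   # cost of the generation
}
```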

This overview screen lets you scan for the logs you want to dive into; to see the details of a log, simply click on its line.

Request

To learn more about the technical request being made, select the Request panel on the right of your log detail.

Here you'll be able to see the full model configuration, as well as the execution latency and cost of the generation.

The **Request** tab lets you dive into the technical details of your model during this generation.
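
As a rough sketch, the configuration surfaced here covers the generation parameters used for the call; the names below are illustrative and not the exact fields orq.ai displays.

```python
# Illustrative sketch of the technical details the Request tab surfaces.
# Parameter names and values are hypothetical, not the exact orq.ai schema.
request_details = {
    "model": "gpt-4o",
    "configuration": {
        "temperature": 0.7,  # sampling temperature used for this generation
        "max_tokens": 512,   # upper bound on the generated length
    },
    "latency_ms": 1240,      # execution latency of the generation
    "cost_usd": 0.0031,      # cost of the generation
}
```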

By opening the Data panel, you can see the Inputs and Context Attributes used for this call.

The **Data** tab lets you confirm which inputs and metadata were used during this generation.
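
As an illustration, the inputs are the variable values filled into your prompt and the context attributes are the metadata sent along with the call; the keys below are hypothetical.

```python
# Illustrative sketch of what the Data tab confirms for a generation.
# Keys and values are hypothetical examples, not the exact orq.ai schema.
data_panel = {
    "inputs": {               # variable values filled into the prompt
        "customer_name": "Ada",
        "question": "How do I reset my password?",
    },
    "context_attributes": {   # metadata sent along with the call
        "environment": "production",
        "locale": "en",
    },
}
```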

Feedback

You can provide feedback on the quality and accuracy of the generations recorded in a log.

📘

To learn more, see Feedbacks.

Hyperlinking

When you want to copy the exact same LLM configuration (prompt, model, etc.) to another module within Orq, you can "hyperlink" it. The hyperlinking buttons are available at the top-right of the panel.

Hyperlinking buttons