What are Annotations?

Annotations are structured key-value pairs attached to traces and spans in your observability data. They let you track quality metrics, collect user feedback, support human review workflows, and build training datasets from annotated traces.

Key Concepts

Human Reviews

Human Reviews define the schema for annotations:
  • Key (unique identifier)
  • Value type and validation rules
  • Available options (for select types)
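As a minimal sketch, a Human Review definition can be modeled as a small structure holding the key, value type, and options, plus a validation check. The field names and `validate` helper below are illustrative assumptions, not an official schema.

```python
# Hypothetical sketch of a Human Review definition; names are illustrative.
from dataclasses import dataclass


@dataclass(frozen=True)
class HumanReview:
    key: str            # unique identifier for the annotation
    value_type: str     # "string", "number", or "string_array"
    options: tuple = () # allowed values, for select types

    def validate(self, value) -> bool:
        """Check a candidate annotation value against this definition."""
        if self.value_type == "number":
            ok = isinstance(value, (int, float)) and not isinstance(value, bool)
        elif self.value_type == "string":
            ok = isinstance(value, str)
        elif self.value_type == "string_array":
            ok = isinstance(value, list) and all(isinstance(v, str) for v in value)
        else:
            return False
        if ok and self.options:
            values = value if isinstance(value, list) else [value]
            ok = all(v in self.options for v in values)
        return ok


# A select-type review: only the listed options are valid values.
quality = HumanReview(key="response_quality", value_type="string",
                      options=("good", "acceptable", "poor"))
```

With this sketch, `quality.validate("good")` passes while an unlisted value like `"excellent"` or a number is rejected.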

Annotations

Annotations attach quality metrics, feedback, and evaluations to individual traces and spans. Each annotation consists of:
  • Key: A unique identifier matching a Human Review definition
  • Value: Data in the format defined by the Human Review (string, number, or string array)
  • Metadata: Optional context like the identity of who submitted the annotation
Annotations persist on spans in your observability data and enable quality tracking, human-in-the-loop workflows, and dataset curation.
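The three parts above can be pictured as a simple payload checked against its Human Review definition. This is a sketch under assumptions: the dictionary shape, field names, and `check` helper are illustrative, not a documented API.

```python
# Hypothetical mapping of Human Review keys to expected Python value types,
# assumed to have been defined elsewhere.
definitions = {
    "relevance_score": (int, float),
    "reviewer_notes": (str,),
}

# An annotation: key matching a Human Review, a value in the defined
# format, and optional metadata about who submitted it.
annotation = {
    "key": "relevance_score",
    "value": 4,
    "metadata": {"submitted_by": "reviewer@example.com"},
}


def check(annotation, definitions):
    """Accept only annotations whose key is defined and whose value matches."""
    expected = definitions.get(annotation["key"])
    return expected is not None and isinstance(annotation["value"], expected)


assert check(annotation, definitions)
```

An annotation whose key has no matching definition, or whose value has the wrong type, fails the check.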

Annotation Queues

Annotation Queues organize human review workflows in the AI Studio. They automatically filter and present relevant traces for review based on configurable rules. Queues enable teams to:
  • Systematically review traces matching specific criteria
  • Apply annotations efficiently through the UI
  • Collaborate on quality assurance processes
  • Build curated datasets from annotated traces
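Conceptually, a queue's configurable rule acts as a filter over traces. The sketch below assumes a simple list-of-dicts trace shape and a predicate rule; both are illustrative, not the product's actual data model.

```python
# Hypothetical traces; field names are illustrative assumptions.
traces = [
    {"id": "t1", "model": "gpt-4", "latency_ms": 3200},
    {"id": "t2", "model": "gpt-4", "latency_ms": 450},
    {"id": "t3", "model": "claude", "latency_ms": 5100},
]


def queue_items(traces, rule):
    """Return the traces a queue would surface for human review."""
    return [t for t in traces if rule(t)]


# Example rule: surface slow traces for review.
slow = queue_items(traces, lambda t: t["latency_ms"] > 3000)
```

Here `slow` contains traces `t1` and `t3`, the ones matching the queue's criteria.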

Getting Started

Human Reviews

Create Human Review definitions that specify annotation keys, value types, and validation rules. These definitions must exist before annotations can be added.
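The define-first requirement can be sketched as an in-memory registry that rejects annotations whose key has no Human Review definition. The class and method names below are hypothetical, chosen only to illustrate the ordering constraint.

```python
# Hypothetical registry enforcing "define the Human Review, then annotate".
class AnnotationRegistry:
    def __init__(self):
        self.reviews = {}       # key -> value type name
        self.annotations = []   # accepted annotations

    def create_human_review(self, key, value_type):
        """Register a Human Review definition for an annotation key."""
        self.reviews[key] = value_type

    def annotate(self, key, value):
        """Accept an annotation only if its key has a definition."""
        if key not in self.reviews:
            raise KeyError(f"No Human Review defined for key {key!r}")
        self.annotations.append({"key": key, "value": value})


registry = AnnotationRegistry()
registry.create_human_review("helpfulness", "number")
registry.annotate("helpfulness", 5)  # allowed: definition exists
```

Calling `registry.annotate` with an undefined key raises an error, mirroring the requirement that definitions come first.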