What are Annotations?
Annotations are structured key-value pairs attached to traces and spans in your observability data. They enable you to track quality metrics, collect user feedback, support human review workflows, and build training datasets from annotated traces.
Key Concepts
Human Reviews
Human Reviews define the schema for annotations:
- Key (unique identifier)
- Value type and validation rules
- Available options (for select types)
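A Human Review definition can be pictured as a small record holding the key, value type, and validation rules. The sketch below is illustrative only: the field names (`value_type`, `min`, `max`, `options`) are assumptions, not the platform's documented schema.

```python
# Hypothetical Human Review definitions; field names are assumptions
# chosen to mirror the three schema elements listed above.
numeric_review = {
    "key": "answer_quality",       # unique identifier
    "value_type": "number",        # string | number | string array
    "min": 1,                      # validation rules
    "max": 5,
}

select_review = {
    "key": "failure_mode",
    "value_type": "string",
    # available options (for select types)
    "options": ["hallucination", "refusal", "formatting"],
}
```

Every annotation you later submit must reference one of these keys and satisfy the corresponding value type and rules.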
Annotations
Annotations are structured key-value pairs that attach quality metrics, feedback, and evaluations to your traces and spans. Each annotation consists of:
- Key: A unique identifier matching a Human Review definition
- Value: Data in the format defined by the Human Review (string, number, or string array)
- Metadata: Optional context like the identity of who submitted the annotation
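The key-to-definition relationship above can be sketched as a small validation check. This is a minimal illustration, not the platform's actual validation logic; the value-type names mirror the ones listed above.

```python
# Illustrative check of an annotation against its Human Review definition.
def validate_annotation(annotation, review):
    """Return True if the annotation's key and value fit the definition."""
    if annotation["key"] != review["key"]:
        return False
    value = annotation["value"]
    expected = review["value_type"]
    if expected == "string":
        return isinstance(value, str)
    if expected == "number":
        return isinstance(value, (int, float))
    if expected == "string array":
        return isinstance(value, list) and all(isinstance(v, str) for v in value)
    return False

review = {"key": "answer_quality", "value_type": "number"}
ok = validate_annotation({"key": "answer_quality", "value": 4}, review)       # True
bad = validate_annotation({"key": "answer_quality", "value": "great"}, review)  # False
```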
Annotation Queues
Annotation Queues organize human review workflows in the AI Studio. They automatically filter and present relevant traces for review based on configurable rules. Queues enable teams to:
- Systematically review traces matching specific criteria
- Apply annotations efficiently through the UI
- Collaborate on quality assurance processes
- Build curated datasets from annotated traces
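Conceptually, a queue's configurable rules act like a filter over incoming traces. The shape below is purely illustrative; the rule fields (`field`, `op`, `value`) are assumptions, since queues are configured in the AI Studio UI rather than as raw data.

```python
# Hypothetical queue configuration; filter fields are assumptions.
queue = {
    "name": "Low-quality RAG responses",
    "filters": [
        # present traces whose quality score fell below threshold
        {"field": "metrics.answer_quality", "op": "<", "value": 3},
        # restrict the queue to a particular workflow tag
        {"field": "tags", "op": "contains", "value": "rag"},
    ],
}
```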
Getting Started
Human Reviews
Create Human Review definitions that specify annotation keys, value types, and validation rules. Human Reviews must exist before annotations can be added.
Annotations via the API
Add and remove annotations programmatically using the REST API or SDKs. Perfect for automated quality tracking and integration into your application code.
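A programmatic submission might look like the sketch below. The endpoint path, auth header, and payload field names are assumptions for illustration, not the platform's documented API; consult the API reference for the real shapes.

```python
# Hypothetical sketch: payload fields and the commented endpoint are
# assumptions, not the platform's documented REST API.
import json

def build_annotation_payload(key, value, submitted_by=None):
    """Assemble an annotation payload; key must match a Human Review."""
    payload = {"key": key, "value": value}
    if submitted_by is not None:
        # optional metadata, e.g. the identity of the submitter
        payload["metadata"] = {"submitted_by": submitted_by}
    return payload

payload = build_annotation_payload(
    "answer_quality", 4, submitted_by="reviewer@example.com"
)
print(json.dumps(payload))

# An actual call might resemble (hypothetical endpoint and auth):
# requests.post(f"{BASE_URL}/traces/{trace_id}/annotations",
#               headers={"Authorization": f"Bearer {API_KEY}"},
#               json=payload)
```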
Annotation Queues in the AI Studio
Organize human review workflows with annotation queues in the UI. Efficiently review traces, apply annotations, and manage quality assurance processes.