What are Annotations?
Annotations are structured key-value pairs that capture human feedback on traces and spans in your observability data. They let you collect human insights, apply quality assessments, run human review workflows, and build training datasets from human-reviewed traces.
Key Concepts
Human Reviews
Human Reviews define the schema for annotations:
- Key (unique identifier)
- Value type and validation rules
- Available options (for select types)
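A Human Review definition might be sketched as a small record like the one below. This is a minimal illustration, not the product's actual schema; the field names (`key`, `value_type`, `options`) and the `validate_review` helper are assumptions chosen to mirror the three properties listed above.

```python
# Hypothetical Human Review definition; field names are illustrative.
human_review = {
    "key": "response_quality",               # unique identifier
    "value_type": "select",                  # e.g. "string", "number", "select"
    "options": ["good", "neutral", "bad"],   # allowed values for select types
}

def validate_review(review: dict) -> bool:
    """Check the invariants described above: a non-empty key, and
    options present whenever the value type is a select."""
    if not review.get("key"):
        return False
    if review.get("value_type") == "select" and not review.get("options"):
        return False
    return True
```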
Annotations
Annotations are structured key-value pairs that capture human feedback on your traces and spans. Each annotation consists of:
- Key: A unique identifier matching a Human Review definition
- Value: Data in the format defined by the Human Review (string, number, or string array)
- Metadata: Optional context like the identity of who submitted the annotation
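Concretely, an annotation could be represented as a payload like the following. The field names here are illustrative assumptions that mirror the Key/Value/Metadata structure described above, and `value_matches` is a hypothetical helper checking the three allowed value shapes.

```python
# Hypothetical annotation payload; structure mirrors the list above.
annotation = {
    "key": "response_quality",  # must match a Human Review definition
    "value": "good",            # string, number, or string array
    "metadata": {               # optional context
        "submitted_by": "reviewer@example.com",
    },
}

def value_matches(value) -> bool:
    """Check that a value is one of the allowed shapes:
    a string, a number, or a list of strings."""
    if isinstance(value, (str, int, float)):
        return True
    return isinstance(value, list) and all(isinstance(v, str) for v in value)
```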
Annotation Queues
Annotation Queues organize human review workflows in the AI Studio. They automatically filter and present relevant traces for review based on configurable rules. Queues enable teams to:
- Systematically review traces matching specific criteria
- Apply annotations efficiently through the UI
- Collaborate on quality assurance processes
- Build curated datasets from annotated traces
Getting Started
Human Reviews
Create Human Review definitions that specify annotation keys, value types, and validation rules. A Human Review definition is required before annotations can be added.
Annotations in the AI Studio
Apply human feedback and corrections to traces and logs directly in the AI Studio. Review AI responses and capture quality assessments through the UI.
Annotations via the API
Add and remove human feedback programmatically using the REST API or SDKs. Perfect for capturing user feedback and integrating human review into your application.
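As a sketch of what adding an annotation over REST could look like, the snippet below builds (but does not send) an HTTP request with Python's standard library. The base URL, endpoint path, and bearer-token auth scheme are all assumptions for illustration; consult the API reference for the real endpoints.

```python
import json
from urllib import request

API_BASE = "https://api.example.com/v1"  # hypothetical base URL
API_KEY = "YOUR_API_KEY"                 # hypothetical auth token

def build_annotation_request(trace_id: str, key: str, value) -> request.Request:
    """Construct a hypothetical 'add annotation' POST request.
    The payload carries the annotation key and value; the key must
    match an existing Human Review definition."""
    payload = json.dumps({"key": key, "value": value}).encode("utf-8")
    return request.Request(
        url=f"{API_BASE}/traces/{trace_id}/annotations",  # illustrative path
        data=payload,
        method="POST",
        headers={
            "Authorization": f"Bearer {API_KEY}",
            "Content-Type": "application/json",
        },
    )

req = build_annotation_request("trace_123", "response_quality", "good")
# To actually send it: urllib.request.urlopen(req)
# (requires a real endpoint and a valid API key)
```

The same pattern applies to removing an annotation, typically as a DELETE against the annotation's resource URL.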
Annotation Queues in the AI Studio
Organize human review workflows with annotation queues in the UI. Efficiently review traces, apply human feedback, and manage quality assurance processes.