Getting Started
Instructor enables structured outputs from language models using Pydantic schemas. Tracing Instructor with Orq.ai provides insight into data extraction patterns, validation success rates, retry behavior, and structured output performance, helping you optimize your LLM-powered data processing pipelines.

Prerequisites

Before you begin, ensure you have:

- An Orq.ai account and API key
- Python 3.8+
- Instructor library installed in your project
- OpenAI API key (or other supported LLM provider credentials)
Install Dependencies
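A sketch of the install step. The `instructor` and `openai` packages are the library and provider SDK named above; the OpenInference and OpenTelemetry package names below follow their standard naming conventions, but verify the exact names for your setup:

```shell
pip install instructor openai \
    openinference-instrumentation-instructor \
    opentelemetry-sdk opentelemetry-exporter-otlp
```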
Configure Orq.ai
Set up your environment variables to connect to Orq.ai’s OpenTelemetry collector.

Integration

Instructor uses OpenInference instrumentation for automatic OpenTelemetry tracing. Set up the instrumentation in your application:
Use Instructor with automatic tracing:
All Instructor structured output extractions will be automatically instrumented and exported to Orq.ai through the OTLP exporter. For more details, see Traces.
Advanced Examples
Complex Nested Schemas

Instructor is also compatible with our AI Gateway; to learn more, see Instructor.
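As a sketch of a complex nested schema, the Pydantic models below nest several levels deep; all model and field names are illustrative. With an instrumented client you would pass the top-level model as `response_model`, and the extraction is traced the same way as a flat schema:

```python
from typing import List

from pydantic import BaseModel


class Address(BaseModel):
    street: str
    city: str
    country: str


class Company(BaseModel):
    name: str
    address: Address


class Person(BaseModel):
    name: str
    age: int
    employer: Company
    skills: List[str]


# With an instrumented client (names illustrative):
# person = client.chat.completions.create(
#     model="gpt-4o-mini",
#     response_model=Person,
#     messages=[{"role": "user", "content": bio_text}],
# )
# Pydantic validates the nested structure; shown here with static data:
person = Person.model_validate(
    {
        "name": "Ada",
        "age": 36,
        "employer": {
            "name": "Acme",
            "address": {"street": "1 Main St", "city": "London", "country": "UK"},
        },
        "skills": ["python", "otel"],
    }
)
print(person.employer.address.city)
```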