Instructor Installation Guide: Setup for LLM Structured Outputs¶
Learn how to install Instructor, the leading Python library for extracting structured data from LLMs like GPT-4, Claude, and Gemini. This comprehensive installation tutorial covers all major LLM providers and gets you ready for production use.
Quick Start: Install Instructor for LLM Development¶
Get started with structured LLM outputs in seconds. Install Instructor using pip:
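```bash
pip install instructor
```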
Instructor leverages Pydantic for type-safe LLM data extraction, so every response is validated against a model you define.
Pro Tip: Use `uv` for faster installation: `uv pip install instructor`
LLM Provider Installation Guide¶
Instructor supports 15+ LLM providers. Here's how to install and configure the most common ones:
OpenAI (GPT-4, GPT-3.5)¶
OpenAI is the default LLM provider for Instructor and works out of the box for GPT-4 and GPT-3.5-turbo structured outputs:
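```bash
pip install instructor  # the base install already includes the OpenAI SDK
```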
Configure your OpenAI API key for LLM access:
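```bash
export OPENAI_API_KEY=your-openai-api-key  # the OpenAI client reads this automatically
```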
Anthropic Claude LLM Setup¶
Extract structured data from Claude 3 models (Opus, Sonnet, Haiku) with native tool support:
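```bash
pip install "instructor[anthropic]"
```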
Configure Claude API access:
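```bash
export ANTHROPIC_API_KEY=your-anthropic-api-key
```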
Google Gemini LLM Integration¶
Use Gemini Pro and Flash models for structured outputs with function calling:
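A minimal install sketch; the extra name below matches recent Instructor releases, and if your version doesn't recognize it you can install the Google SDK alongside Instructor instead:

```bash
pip install "instructor[google-generativeai]"
# fallback: pip install instructor google-generativeai
```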
Set up Gemini API access:
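Assuming the default environment variable the Google Generative AI SDK looks for:

```bash
export GOOGLE_API_KEY=your-gemini-api-key
```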
Cohere¶
To use with Cohere's models:
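```bash
pip install "instructor[cohere]"
# fallback if the extra isn't recognized: pip install instructor cohere
```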
Set up your Cohere API key:
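The environment variable name depends on your Cohere SDK version; passing `api_key` to the client explicitly always works:

```bash
export CO_API_KEY=your-cohere-api-key  # some SDK versions expect COHERE_API_KEY instead
```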
Mistral¶
To use with Mistral AI's models:
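A sketch assuming the `mistralai` extra; installing the SDK directly alongside Instructor is an equivalent fallback:

```bash
pip install "instructor[mistralai]"
# fallback: pip install instructor mistralai
```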
Set up your Mistral API key:
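```bash
export MISTRAL_API_KEY=your-mistral-api-key  # pass this to the Mistral client when you construct it
```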
LiteLLM (Multiple Providers)¶
To use LiteLLM for accessing multiple providers:
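```bash
pip install "instructor[litellm]"
# fallback: pip install instructor litellm
```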
Set up API keys for the providers you want to use.
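LiteLLM picks up the standard environment variables for each provider, for example:

```bash
export OPENAI_API_KEY=your-openai-api-key
export ANTHROPIC_API_KEY=your-anthropic-api-key
```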
Verify Your Instructor LLM Setup¶
Test your Instructor installation with this simple LLM structured output example:
```python
import instructor
from openai import OpenAI
from pydantic import BaseModel

class Person(BaseModel):
    name: str
    age: int

# Patch the OpenAI client so responses are validated against Pydantic models
client = instructor.from_openai(OpenAI())

person = client.chat.completions.create(
    model="gpt-3.5-turbo",
    response_model=Person,
    messages=[
        {"role": "user", "content": "John Doe is 30 years old"}
    ],
)

print(f"Name: {person.name}, Age: {person.age}")
```
Next Steps in Your LLM Tutorial Journey¶
With Instructor installed, you're ready to build powerful LLM applications:
- Create Your First LLM Extraction - Build structured outputs with any LLM
- Master Response Models - Learn Pydantic models for LLM data validation
- Configure LLM Clients - Set up OpenAI, Anthropic, Google, and more
Common Installation Issues¶
- Import Errors: Ensure you've installed the provider-specific extras (e.g., `instructor[anthropic]`)
- API Key Issues: Verify your environment variables are set correctly
- Version Conflicts: Use `pip install --upgrade instructor` to get the latest version
Ready to extract structured data from LLMs? Continue to Your First Extraction →