A comprehensive collection of examples demonstrating PydanticAI framework capabilities, from basic model requests to advanced document processing with schema validation.
- Python 3.13+
- uv package manager
# Install dependencies
uv sync
# Create .env file with your API key
echo "OPENAI_API_KEY=your-key-here" > .env

Note: Most examples use OpenAI's GPT-5.1. Ensure your API key has appropriate permissions and sufficient quota.
Recommended order for learning PydanticAI:
- Direct Model Requests - Understand basic LLM API calls
- Temperature - Understand model parameters
- Reasoning Effort - See how reasoning effort changes the model's output
- Basic Sentiment - Learn structured outputs with Pydantic
- Dynamic Classification - Runtime schema generation
- Bielik - Local models and tools
- History Processor - Multi-turn conversations
- OCR Parsing - Complex real-world document processing
Location: direct_model_request/
Demonstrates direct model API calls without using Agents.
- Basic synchronous model requests using gpt-5.2
- When to use direct API vs Agents
- Simple text prompt handling
Location: temperature/
Control model behavior using temperature.
- Temperature effects on creativity vs consistency
- Practical configuration examples
- Use case recommendations
Location: reasoning_effort/
Demonstrates reasoning_effort parameter for gpt-5.2.
- Control depth of internal reasoning
- Complex problem-solving examples
- Trade-off between accuracy and latency
Location: basic_sentiment/
Fixed 3-class sentiment analysis (positive/negative/neutral) with structured outputs.
- Fixed Literal types defined at design time
- Simple accuracy evaluation
- Type-safe results with Pydantic validation
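The fixed-class pattern can be sketched as follows. This is a minimal illustration of the idea (the schema names and fields are assumptions, not the repo's exact code): a `Literal` field means any label outside the design-time vocabulary fails validation.

```python
from typing import Literal

from pydantic import BaseModel, ValidationError


class SentimentResult(BaseModel):
    """Schema the agent is asked to fill; names are illustrative."""
    sentiment: Literal["positive", "negative", "neutral"]
    confidence: float


# A valid output passes validation
ok = SentimentResult(sentiment="positive", confidence=0.92)

# An out-of-vocabulary label is rejected at validation time
try:
    SentimentResult(sentiment="angry", confidence=0.5)
except ValidationError:
    print("rejected")  # → rejected
```

Passing such a model as the agent's output type is what makes the result type-safe: the LLM's answer is parsed and validated before your code ever sees it.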
Location: dynamic_classification/
Runtime-adaptable classification that handles any number of classes dynamically.
- Dynamic Literal type generation using create_model()
- Same code handles binary, multi-class, and fine-grained classification
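The runtime-schema trick can be sketched like this (function and model names are illustrative): `Literal` is subscripted with a tuple built at runtime, and `create_model()` wraps it in a fresh Pydantic model.

```python
from typing import Literal

from pydantic import ValidationError, create_model


def build_classifier_schema(labels: list[str]):
    """Build a result model whose label set is only known at runtime."""
    label_type = Literal[tuple(labels)]  # dynamic Literal from the label list
    return create_model("ClassificationResult", label=(label_type, ...))


# The same code handles binary and multi-class label sets
BinaryResult = build_classifier_schema(["spam", "ham"])
print(BinaryResult(label="spam").label)  # → spam

try:
    BinaryResult(label="urgent")  # not in the runtime vocabulary
except ValidationError:
    print("unknown label rejected")  # → unknown label rejected
```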
Location: bielik_example/
Learn to use Bielik, a Polish language LLM, with PydanticAI running locally via Ollama.
- Basic Inference: Simple agent setup and synchronous requests
- Tool Calling: Custom tools, multi-turn conversations, async operations
- Local model serving without cloud API dependencies
- Perfect for understanding agent architecture with a real model
Location: history_processor/
Learn how to manage conversation history in AI agents.
- Basic history handling and inspection
- Multi-turn conversations with context awareness
- History persistence (JSON and Database)
- Advanced filtering and transformation
- Context window management strategies (fixed, dynamic, and tool-aware)
- Production-ready database archival
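The fixed-window strategy from the list above can be sketched in a few lines. This is a simplified illustration using plain dicts, not PydanticAI's actual message classes: keep the system message, drop all but the N most recent turns.

```python
# Minimal fixed-window context management; message shape is illustrative.
def trim_history(messages: list[dict], max_turns: int) -> list[dict]:
    system = [m for m in messages if m["role"] == "system"]
    turns = [m for m in messages if m["role"] != "system"]
    return system + turns[-max_turns:]


history = [
    {"role": "system", "content": "You are helpful."},
    {"role": "user", "content": "Q1"},
    {"role": "assistant", "content": "A1"},
    {"role": "user", "content": "Q2"},
    {"role": "assistant", "content": "A2"},
]
print([m["content"] for m in trim_history(history, max_turns=2)])
# → ['You are helpful.', 'Q2', 'A2']
```

The dynamic and tool-aware variants in the examples follow the same shape, but decide what to keep based on token counts or tool-call boundaries instead of a fixed count.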
Location: ocr_parsing/
Learn how to work with documents using PydanticAI for OCR (Optical Character Recognition).
- Basic OCR: Unstructured text extraction from PDFs
- Structured Output: Type-safe document analysis with schema validation
- Validation Errors: Error handling when LLM output doesn't match schema
- PDF to image conversion pipeline
- Parallel async processing with concurrency control
- Production-ready document processing patterns
# Install dependencies
uv sync
# Set API key
echo "OPENAI_API_KEY=your-key" > .env

# Direct model requests
cd direct_model_request
uv run direct_request_demo.py
# Model parameters
cd temperature
uv run temperature_demo.py
# Reasoning effort
cd reasoning_effort
uv run reasoning_demo.py
# Basic sentiment classifier
cd basic_sentiment
uv run sentiment_classifier.py
# Dynamic classifier
cd dynamic_classification
uv run dynamic_classifier.py
# Bielik local model examples
# Note: Requires Ollama setup (see bielik_example/README.md)
cd bielik_example
uv run bielik_basic_inference.py
uv run bielik_basic_tools.py
# History processor - Run individual examples
cd history_processor
uv run 1_basic_history_handling.py
uv run 2_continuous_history.py
uv run 3_history_usage.py
uv run 4_history_filtering.py
uv run 5a_history_length_fixed.py
uv run 5b_history_length_dynamic.py
uv run 5c_history_with_tools.py
uv run 6_persistent_history.py
# OCR Parsing - Run examples in order
cd ocr_parsing
uv run 1_basic_ocr_demo.py
uv run 2_ocr_with_structured_output.py
uv run 3_ocr_validation.py  # Uncomment validation line in code first

Most examples use PydanticAI's Agent class, which wraps an LLM with:
- System prompts to guide behavior
- Output type schemas for structured responses
- Async/await support for concurrent requests
It's worth noting that since these are examples, most of them are fairly basic. However, it's easy to add a tool to a given agent. Let's look at the **OCR Parsing** code.
Currently the Agent does all the work itself: it classifies the document, runs OCR, parses the output, and so on, handling every document the same way. But what if we'd like different behavior based on the document type?
from dataclasses import dataclass

from pydantic_ai import Agent, RunContext

# Hypothetical schemas for this sketch
from my_schemas import OCRInvoiceOutput, ReportOcrOutput

@dataclass
class MyDeps:
    ocr_service: "OcrService"  # your OCR backend, injected as a dependency

# The Agent acts as a router, deciding which tool to call
# based on the document's visual or textual cues.
agent = Agent(
    'openai:gpt-5.1',
    deps_type=MyDeps,
    system_prompt="Analyze the document and use the appropriate tool for parsing.",
)

@agent.tool
async def parse_invoice(ctx: RunContext[MyDeps], data: bytes) -> OCRInvoiceOutput:
    """Use this tool when the document is identified as an Invoice."""
    # Your specialized OCR & validation logic here
    return await ctx.deps.ocr_service.process(data, schema=OCRInvoiceOutput)

@agent.tool
async def parse_report(ctx: RunContext[MyDeps], data: bytes) -> ReportOcrOutput:
    """Use this tool when the document is a multi-page Annual Report."""
    # Custom logic for complex reports
    return await ctx.deps.ocr_service.process(data, schema=ReportOcrOutput)

Examples show how to enforce type safety using Pydantic BaseModel:
- Basic classification: Literal types
- Dynamic classification: create_model() for runtime schemas
- OCR parsing: Complex nested schemas with validation
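A nested schema with validation, in the spirit of the OCR examples, might look like this (field names are assumptions, not the repo's actual schema). Pydantic reports exactly which nested field failed, which is what the validation-error example relies on:

```python
from pydantic import BaseModel, Field, ValidationError


class LineItem(BaseModel):
    description: str
    amount: float = Field(ge=0)  # negative amounts are rejected


class InvoiceOutput(BaseModel):
    invoice_number: str
    items: list[LineItem]


raw = {"invoice_number": "INV-1", "items": [{"description": "OCR svc", "amount": -5}]}
try:
    InvoiceOutput.model_validate(raw)
except ValidationError as e:
    print(e.error_count(), "validation error")  # → 1 validation error
```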
Several examples demonstrate async patterns:
- Parallel processing with asyncio.gather()
- Semaphore-based rate limiting
- Efficient handling of multiple documents
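The gather-plus-semaphore pattern can be sketched as follows; the sleep stands in for an LLM or OCR call, and the names are illustrative:

```python
import asyncio


async def process_document(name: str, sem: asyncio.Semaphore) -> str:
    async with sem:  # at most 2 jobs run concurrently
        await asyncio.sleep(0.01)  # stands in for an LLM/OCR call
        return f"{name}: done"


async def main() -> list[str]:
    sem = asyncio.Semaphore(2)
    docs = ["a.pdf", "b.pdf", "c.pdf"]
    # All tasks are launched at once; the semaphore caps concurrency
    return await asyncio.gather(*(process_document(d, sem) for d in docs))


results = asyncio.run(main())
print(results)  # → ['a.pdf: done', 'b.pdf: done', 'c.pdf: done']
```

`asyncio.gather()` preserves input order, so results line up with the document list even though completion order may differ.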
Learn how to manage conversational context:
- Persistent history storage
- Token-aware context windowing
- History filtering and transformation
Bielik example shows alternative to cloud APIs:
- Local model serving with Ollama
- Custom tool integration
- Same agent patterns as OpenAI models
├── direct_model_request/
│ ├── direct_request_demo.py
│ └── README.md
├── temperature/
│ ├── temperature_demo.py
│ └── README.md
├── reasoning_effort/
│ ├── reasoning_demo.py
│ └── README.md
├── basic_sentiment/
│ ├── sentiment_classifier.py
│ └── README.md
├── dynamic_classification/
│ ├── dynamic_classifier.py
│ └── README.md
├── bielik_example/
│ ├── bielik_basic_inference.py
│ ├── bielik_basic_tools.py
│ ├── Modelfile
│ ├── README.md
│ └── (optional) .env
├── history_processor/
│ ├── 1_basic_history_handling.py
│ ├── 2_continuous_history.py
│ ├── 3_history_usage.py
│ ├── 4_history_filtering.py
│ ├── 5a_history_length_fixed.py
│ ├── 5b_history_length_dynamic.py
│ ├── 5c_history_with_tools.py
│ ├── 6_persistent_history.py
│ ├── README.md
│ └── output_3.json
├── ocr_parsing/
│ ├── 1_basic_ocr_demo.py
│ ├── 2_ocr_with_structured_output.py
│ ├── 3_ocr_validation.py
│ ├── README.md
│ ├── files/
│ │ ├── samples/ # Sample PDF documents
│ │ ├── temp_files/ # Temporary image files during processing
│ │ └── results/ # Output JSON files
├── pyproject.toml
└── README.md

- Ensure OPENAI_API_KEY is set in .env
- Verify key has appropriate permissions
- Check for rate limiting (429 errors)
- Run uv sync to install all dependencies
- Verify you're using Python 3.13+
- Some examples require async-compatible event loops
- On Windows, you may need to set event loop policy
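Setting the policy can be sketched as below; the helper name is illustrative. On Windows, the selector policy avoids issues some async libraries have with the default proactor loop, and the guard makes the snippet a no-op elsewhere:

```python
import asyncio
import sys


def configure_event_loop() -> None:
    """On Windows, prefer the selector event loop policy."""
    if sys.platform == "win32":
        asyncio.set_event_loop_policy(asyncio.WindowsSelectorEventLoopPolicy())


configure_event_loop()  # call once, before asyncio.run()
print(asyncio.run(asyncio.sleep(0, result="loop ok")))  # → loop ok
```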
- poppler not found: Install via your package manager (brew/apt/choco)
- PDF conversion fails: Ensure PDF is valid and readable
- Rate limiting: Reduce the semaphore value in ocr_parsing/shared_fns.py
See individual example READMEs for specific setup requirements.
- Python Documentation
- PydanticAI Documentation
- Pydantic Documentation
- OpenAI API Reference
- Python asyncio Guide
Found an issue or have an improvement? Feel free to contribute to this example repository.