2 changes: 1 addition & 1 deletion docs/examples/ag-ui.md
@@ -15,7 +15,7 @@ Demonstrates:

## Running the Example

-With [dependencies installed and environment variables set](./index.md#usage)
+With [dependencies installed and environment variables set](./setup.md#usage)
you will need two command line windows.

### Pydantic AI AG-UI backend
2 changes: 1 addition & 1 deletion docs/examples/bank-support.md
@@ -8,7 +8,7 @@ Demonstrates:

## Running the Example

-With [dependencies installed and environment variables set](./index.md#usage), run:
+With [dependencies installed and environment variables set](./setup.md#usage), run:

```bash
python/uv-run -m pydantic_ai_examples.bank_support
```
2 changes: 1 addition & 1 deletion docs/examples/chat-app.md
@@ -15,7 +15,7 @@ and `chat_app.ts` which renders messages in the browser.

## Running the Example

-With [dependencies installed and environment variables set](./index.md#usage), run:
+With [dependencies installed and environment variables set](./setup.md#usage), run:

```bash
python/uv-run -m pydantic_ai_examples.chat_app
```
2 changes: 1 addition & 1 deletion docs/examples/data-analyst.md
@@ -19,7 +19,7 @@ Demonstrates:

## Running the Example

-With [dependencies installed and environment variables set](./index.md#usage), run:
+With [dependencies installed and environment variables set](./setup.md#usage), run:

```bash
python/uv-run -m pydantic_ai_examples.data_analyst
```
2 changes: 1 addition & 1 deletion docs/examples/flight-booking.md
@@ -28,7 +28,7 @@ graph TD

## Running the Example

-With [dependencies installed and environment variables set](./index.md#usage), run:
+With [dependencies installed and environment variables set](./setup.md#usage), run:

```bash
python/uv-run -m pydantic_ai_examples.flight_booking
```
2 changes: 1 addition & 1 deletion docs/examples/pydantic-model.md
@@ -8,7 +8,7 @@ Demonstrates:

## Running the Example

-With [dependencies installed and environment variables set](./index.md#usage), run:
+With [dependencies installed and environment variables set](./setup.md#usage), run:

```bash
python/uv-run -m pydantic_ai_examples.pydantic_model
```
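The `pydantic_model` example above centers on having an agent return output validated into a Pydantic model. The validation half of that idea can be sketched with Pydantic alone, no LLM involved (`CityLocation` is an illustrative stand-in model, not necessarily the one the example defines):

```python
from pydantic import BaseModel


class CityLocation(BaseModel):
    city: str
    country: str


# An agent configured with `output_type=CityLocation` would hand back data
# already validated into this shape; validating raw JSON directly looks like:
loc = CityLocation.model_validate_json('{"city": "London", "country": "United Kingdom"}')
print(loc.city, "-", loc.country)
```

Malformed or missing fields raise a `ValidationError` instead of silently producing bad data, which is the point of routing model output through a Pydantic model.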
2 changes: 1 addition & 1 deletion docs/examples/question-graph.md
@@ -8,7 +8,7 @@ Demonstrates:

## Running the Example

-With [dependencies installed and environment variables set](./index.md#usage), run:
+With [dependencies installed and environment variables set](./setup.md#usage), run:

```bash
python/uv-run -m pydantic_ai_examples.question_graph
Expand Down
2 changes: 1 addition & 1 deletion docs/examples/rag.md
@@ -28,7 +28,7 @@ docker run --rm \
As with the [SQL gen](./sql-gen.md) example, we run postgres on port `54320` to avoid conflicts with any other postgres instances you may have running.
We also mount the PostgreSQL `data` directory locally to persist the data if you need to stop and restart the container.

-With that running and [dependencies installed and environment variables set](./index.md#usage), we can build the search database with (**WARNING**: this requires the `OPENAI_API_KEY` env variable and will call the OpenAI embedding API around 300 times to generate embeddings for each section of the documentation):
+With that running and [dependencies installed and environment variables set](./setup.md#usage), we can build the search database with (**WARNING**: this requires the `OPENAI_API_KEY` env variable and will call the OpenAI embedding API around 300 times to generate embeddings for each section of the documentation):

```bash
python/uv-run -m pydantic_ai_examples.rag build
```
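The build step stores one embedding per documentation section; retrieval then ranks stored sections by similarity to the query's embedding. A dependency-free sketch of that ranking step (illustrative only — the real example uses OpenAI embeddings and a postgres vector store, and these three-dimensional vectors are made up):

```python
import math


def cosine_similarity(a: list[float], b: list[float]) -> float:
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(x * x for x in b))
    return dot / (na * nb)


def top_k(query: list[float], sections: dict[str, list[float]], k: int = 2) -> list[str]:
    """Rank stored section embeddings against the query embedding."""
    ranked = sorted(sections, key=lambda name: cosine_similarity(query, sections[name]), reverse=True)
    return ranked[:k]


# toy "database" of section-name -> embedding
sections = {
    "agents": [1.0, 0.0, 0.1],
    "models": [0.0, 1.0, 0.0],
    "tools":  [0.9, 0.1, 0.0],
}
print(top_k([1.0, 0.0, 0.0], sections))  # → ['agents', 'tools']
```

In the real example the same ranking happens inside postgres, so only the top matches ever leave the database.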
4 changes: 2 additions & 2 deletions docs/examples/index.md → docs/examples/setup.md
@@ -1,6 +1,6 @@
# Examples

-Examples of how to use Pydantic AI and what it can do.
+Here we include some examples of how to use Pydantic AI and what it can do.

## Usage

@@ -20,7 +20,7 @@ If you clone the repo, you should instead use `uv sync --extra examples` to inst

### Setting model environment variables

-These examples will need you to set up authentication with one or more of the LLMs, see the [model configuration](../models/index.md) docs for details on how to do this.
+These examples will need you to set up authentication with one or more of the LLMs, see the [model configuration](../models/overview.md) docs for details on how to do this.

TL;DR: in most cases you'll need to set one of the following environment variables:

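The hunk is collapsed before the variable list itself, but the usual pattern is an `export` per provider. A sketch (the values are placeholders; `OPENAI_API_KEY` is referenced elsewhere in these docs, the Anthropic key name is an assumption here):

```shell
# Set whichever provider key you plan to use (placeholder values):
export OPENAI_API_KEY='your-api-key'
# or, for Anthropic models:
export ANTHROPIC_API_KEY='your-api-key'
```
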
2 changes: 1 addition & 1 deletion docs/examples/slack-lead-qualifier.md
@@ -64,7 +64,7 @@ You need to have a Slack workspace and the necessary permissions to create apps.

## Usage

-1. Make sure you have the [dependencies installed](./index.md#usage).
+1. Make sure you have the [dependencies installed](./setup.md#usage).

2. Authenticate with Modal:

2 changes: 1 addition & 1 deletion docs/examples/sql-gen.md
@@ -19,7 +19,7 @@ docker run --rm -e POSTGRES_PASSWORD=postgres -p 54320:5432 postgres

_(we run postgres on port `54320` to avoid conflicts with any other postgres instances you may have running)_

-With [dependencies installed and environment variables set](./index.md#usage), run:
+With [dependencies installed and environment variables set](./setup.md#usage), run:

```bash
python/uv-run -m pydantic_ai_examples.sql_gen
```
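Since the example expects postgres on the non-default port `54320`, a quick reachability check before running it can save a confusing traceback. A small helper, assuming the container is on localhost (this helper is not part of the example itself):

```python
import socket


def port_open(host: str, port: int, timeout: float = 1.0) -> bool:
    """Return True if a TCP connection to (host, port) succeeds."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False


# e.g. verify the example's postgres container is up before running it:
# port_open("localhost", 54320)
```
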
2 changes: 1 addition & 1 deletion docs/examples/stream-markdown.md
@@ -8,7 +8,7 @@ Demonstrates:

## Running the Example

-With [dependencies installed and environment variables set](./index.md#usage), run:
+With [dependencies installed and environment variables set](./setup.md#usage), run:

```bash
python/uv-run -m pydantic_ai_examples.stream_markdown
```
2 changes: 1 addition & 1 deletion docs/examples/stream-whales.md
@@ -9,7 +9,7 @@ and displays it as a dynamic table using [`rich`](https://github.com/Textualize/

## Running the Example

-With [dependencies installed and environment variables set](./index.md#usage), run:
+With [dependencies installed and environment variables set](./setup.md#usage), run:

```bash
python/uv-run -m pydantic_ai_examples.stream_whales
```
2 changes: 1 addition & 1 deletion docs/examples/weather-agent.md
@@ -18,7 +18,7 @@ To run this example properly, you might want to add two extra API keys **(Note i
- A weather API key from [tomorrow.io](https://www.tomorrow.io/weather-api/) set via `WEATHER_API_KEY`
- A geocoding API key from [geocode.maps.co](https://geocode.maps.co/) set via `GEO_API_KEY`

-With [dependencies installed and environment variables set](./index.md#usage), run:
+With [dependencies installed and environment variables set](./setup.md#usage), run:

```bash
python/uv-run -m pydantic_ai_examples.weather_agent
```
4 changes: 2 additions & 2 deletions docs/index.md
@@ -24,7 +24,7 @@ We built Pydantic AI with one simple aim: to bring that FastAPI feeling to GenAI
Built by the team behind [Pydantic Validation](https://docs.pydantic.dev/latest/) (the validation layer of the OpenAI SDK, the Anthropic SDK, LangChain, LlamaIndex, AutoGPT, Transformers, CrewAI, Instructor and many more).

- **Model-agnostic**:
-Supports OpenAI, Anthropic, Gemini, Deepseek, Ollama, Groq, Cohere, and Mistral, and there is a simple interface to implement support for [other models](models/index.md).
+Supports OpenAI, Anthropic, Gemini, Deepseek, Ollama, Groq, Cohere, and Mistral, and there is a simple interface to implement support for [other models](models/overview.md).

- **Pydantic Logfire Integration**:
Seamlessly [integrates](logfire.md) with [Pydantic Logfire](https://pydantic.dev/logfire) for real-time debugging, performance monitoring, and behavior tracking of your LLM-powered applications.
@@ -244,7 +244,7 @@ file.

## Next Steps

-To try Pydantic AI yourself, follow the instructions [in the examples](examples/index.md).
+To try Pydantic AI yourself, follow the instructions [in the examples](examples/setup.md).

Read the [docs](agents.md) to learn more about building applications with Pydantic AI.

4 changes: 2 additions & 2 deletions docs/install.md
@@ -27,7 +27,7 @@ To install examples, use the `examples` optional group:
pip/uv-add "pydantic-ai[examples]"
```

-To run the examples, follow instructions in the [examples docs](examples/index.md).
+To run the examples, follow instructions in the [examples docs](examples/setup.md).

## Slim Install

@@ -58,7 +58,7 @@ pip/uv-add "pydantic-ai-slim[openai]"
* `a2a` - installs `fasta2a` [PyPI ↗](https://pypi.org/project/fasta2a){:target="_blank"}
* `ag-ui` - installs `ag-ui-protocol` [PyPI ↗](https://pypi.org/project/ag-ui-protocol){:target="_blank"} and `starlette` [PyPI ↗](https://pypi.org/project/starlette){:target="_blank"}

-See the [models](models/index.md) documentation for information on which optional dependencies are required for each model.
+See the [models](models/overview.md) documentation for information on which optional dependencies are required for each model.

You can also install dependencies for multiple models and use cases, for example:

File renamed without changes.
File renamed without changes.
4 changes: 2 additions & 2 deletions docs/temporal.md
@@ -29,7 +29,7 @@ Activity code faces no restrictions on I/O or external interactions, but if an a

See the [Temporal documentation](https://docs.temporal.io/evaluate/understanding-temporal#temporal-application-the-building-blocks) for more information

-In the case of Pydantic AI agents, integration with Temporal means that [model requests](models/index.md), [tool calls](tools.md) that may require I/O, and [MCP server communication](mcp/client.md) all need to be offloaded to Temporal activities due to their I/O requirements, while the logic that coordinates them (i.e. the agent run) lives in the workflow. Code that handles a scheduled job or web request can then execute the workflow, which will in turn execute the activities as needed.
+In the case of Pydantic AI agents, integration with Temporal means that [model requests](models/overview.md), [tool calls](tools.md) that may require I/O, and [MCP server communication](mcp/client.md) all need to be offloaded to Temporal activities due to their I/O requirements, while the logic that coordinates them (i.e. the agent run) lives in the workflow. Code that handles a scheduled job or web request can then execute the workflow, which will in turn execute the activities as needed.

The diagram below shows the overall architecture of an agentic application in Temporal.
The Temporal Server is responsible for tracking program execution and making sure the associated state is preserved reliably (i.e., stored to an internal database, and possibly replicated across cloud regions).
@@ -71,7 +71,7 @@ See the [Temporal documentation](https://docs.temporal.io/evaluate/understanding

Any agent can be wrapped in a [`TemporalAgent`][pydantic_ai.durable_exec.temporal.TemporalAgent] to get a durable agent that can be used inside a deterministic Temporal workflow, by automatically offloading all work that requires I/O (namely model requests, tool calls, and MCP server communication) to non-deterministic activities.

-At the time of wrapping, the agent's [model](models/index.md) and [toolsets](toolsets.md) (including function tools registered on the agent and MCP servers) are frozen, activities are dynamically created for each, and the original model and toolsets are wrapped to call on the worker to execute the corresponding activities instead of directly performing the actions inside the workflow. The original agent can still be used as normal outside the Temporal workflow, but any changes to its model or toolsets after wrapping will not be reflected in the durable agent.
+At the time of wrapping, the agent's [model](models/overview.md) and [toolsets](toolsets.md) (including function tools registered on the agent and MCP servers) are frozen, activities are dynamically created for each, and the original model and toolsets are wrapped to call on the worker to execute the corresponding activities instead of directly performing the actions inside the workflow. The original agent can still be used as normal outside the Temporal workflow, but any changes to its model or toolsets after wrapping will not be reflected in the durable agent.

Here is a simple but complete example of wrapping an agent for durable execution, creating a Temporal workflow with durable execution logic, connecting to a Temporal server, and running the workflow from non-durable code. All it requires is a Temporal server to be [running locally](https://github.com/temporalio/temporal#download-and-start-temporal-server-locally):

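The workflow/activity split described above can be caricatured without Temporal at all: the coordinator contains only deterministic logic, and anything that does I/O is routed through an "activity" boundary. A plain-Python illustration of the shape (not the `temporalio` or `TemporalAgent` API, and the docs' own complete example is elided from this diff):

```python
from typing import Callable


# "Activity": work that may do I/O. In Temporal this runs on a worker and its
# result is recorded, so workflow replays reuse it instead of re-running it.
def fetch_model_response(prompt: str) -> str:
    return f"response to {prompt!r}"  # stand-in for a real model request


# "Workflow": deterministic coordination only; all I/O goes through activities.
def run_agent_workflow(prompt: str, activity: Callable[[str], str]) -> str:
    result = activity(prompt)  # Temporal would offload and checkpoint this call
    return result.upper()      # deterministic post-processing

print(run_agent_workflow("hi", fetch_model_response))
```

Keeping the coordinator free of I/O is what makes it safe to replay after a crash: re-executing it with recorded activity results reproduces the same state.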
2 changes: 1 addition & 1 deletion docs/troubleshooting.md
@@ -20,7 +20,7 @@ Note: This fix also applies to Google Colab and [Marimo](https://github.com/mari

### `UserError: API key must be provided or set in the [MODEL]_API_KEY environment variable`

-If you're running into issues with setting the API key for your model, visit the [Models](models/index.md) page to learn more about how to set an environment variable and/or pass in an `api_key` argument.
+If you're running into issues with setting the API key for your model, visit the [Models](models/overview.md) page to learn more about how to set an environment variable and/or pass in an `api_key` argument.

## Monitoring HTTPX Requests
