### README.md (1 addition, 1 deletion)
@@ -37,7 +37,7 @@ We built PydanticAI with one simple aim: to bring that FastAPI feeling to GenAI
 Built by the team behind [Pydantic](https://docs.pydantic.dev/latest/) (the validation layer of the OpenAI SDK, the Anthropic SDK, LangChain, LlamaIndex, AutoGPT, Transformers, CrewAI, Instructor and many more).

 * __Model-agnostic__
-Supports OpenAI, Anthropic, Gemini, Ollama, Groq, and Mistral, and there is a simple interface to implement support for [other models](https://ai.pydantic.dev/models/).
+Supports OpenAI, Anthropic, Gemini, Deepseek, Ollama, Groq, Cohere, and Mistral, and there is a simple interface to implement support for [other models](https://ai.pydantic.dev/models/).

 * __Pydantic Logfire Integration__
 Seamlessly [integrates](https://ai.pydantic.dev/logfire/) with [Pydantic Logfire](https://pydantic.dev/logfire) for real-time debugging, performance monitoring, and behavior tracking of your LLM-powered applications.
### docs/index.md (1 addition, 1 deletion)
@@ -17,7 +17,7 @@ We built PydanticAI with one simple aim: to bring that FastAPI feeling to GenAI
 Built by the team behind [Pydantic](https://docs.pydantic.dev/latest/) (the validation layer of the OpenAI SDK, the Anthropic SDK, LangChain, LlamaIndex, AutoGPT, Transformers, CrewAI, Instructor and many more).

-Supports OpenAI, Anthropic, Gemini, Ollama, Groq, and Mistral, and there is a simple interface to implement support for [other models](models.md).
+Supports OpenAI, Anthropic, Gemini, Deepseek, Ollama, Groq, Cohere, and Mistral, and there is a simple interface to implement support for [other models](models.md).

 Seamlessly [integrates](logfire.md) with [Pydantic Logfire](https://pydantic.dev/logfire) for real-time debugging, performance monitoring, and behavior tracking of your LLM-powered applications.
### docs/models.md (76 additions, 21 deletions)
@@ -4,10 +4,11 @@ PydanticAI is Model-agnostic and has built in support for the following model pr
 * [Anthropic](#anthropic)
 * Gemini via two different APIs: [Generative Language API](#gemini) and [VertexAI API](#gemini-via-vertexai)
 * [Ollama](#ollama)
+* [Deepseek](#deepseek)
 * [Groq](#groq)
 * [Mistral](#mistral)

-See [OpenAI-compatible models](#openai-compatible-models) for more examples on how to use models such as [OpenRouter](#openrouter), [Grok (xAI)](#grok-xai) and [DeepSeek](#deepseek) that support the OpenAI SDK.
+See [OpenAI-compatible models](#openai-compatible-models) for more examples on how to use models such as [OpenRouter](#openrouter) and [Grok (xAI)](#grok-xai) that support the OpenAI SDK.

 You can also [add support for other models](#implementing-custom-models).
@@ -304,26 +305,6 @@ agent = Agent(model)
 [`VertexAiRegion`][pydantic_ai.models.vertexai.VertexAiRegion] contains a list of available regions.

-## Ollama
-
-### Install
-
-To use [`OllamaModel`][pydantic_ai.models.ollama.OllamaModel], you need to either install [`pydantic-ai`](install.md), or install [`pydantic-ai-slim`](install.md#slim-install) with the `openai` optional group:
-
-```bash
-pip/uv-add 'pydantic-ai-slim[openai]'
-```
-
-**This is because internally, `OllamaModel` uses the OpenAI API.**
-
-### Configuration
-
-To use [Ollama](https://ollama.com/), you must first download the Ollama client, and then download a model using the [Ollama model library](https://ollama.com/library).
-
-You must also ensure the Ollama server is running when trying to make requests to it. For more information, please see the [Ollama documentation](https://github.com/ollama/ollama/tree/main/docs).
-
-For detailed setup and example, please see the [Ollama setup documentation](https://github.com/pydantic/pydantic-ai/blob/main/docs/api/models/ollama.md).

 ## Groq

 ### Install
@@ -456,6 +437,80 @@ model = OpenAIModel(
 ...
 ```

+### Ollama
+
+To use [Ollama](https://ollama.com/), you must first download the Ollama client, and then download a model using the [Ollama model library](https://ollama.com/library).
+
+You must also ensure the Ollama server is running when trying to make requests to it. For more information, please see the [Ollama documentation](https://github.com/ollama/ollama/tree/main/docs).
+
+#### Example local usage
+
+With `ollama` installed, you can run the server with the model you want to use:
+
+```bash {title="terminal-run-ollama"}
+ollama run llama3.2
+```
+
+(this will pull the `llama3.2` model if you don't already have it downloaded)