The Ollama Python library provides the easiest way to integrate Python 3.8+ projects with [Ollama](https://github.com/jmorganca/ollama).

## Install

```sh
pip install ollama
```

## Usage

```python
import ollama

response = ollama.chat(model='llama2', messages=[
  {
    'role': 'user',
    'content': 'Why is the sky blue?',
  },
])
print(response['message']['content'])
```

## Streaming responses

Response streaming can be enabled by setting `stream=True`, modifying function calls to return a Python generator where each part is an object in the stream.

```python
import ollama

message = {'role': 'user', 'content': 'Why is the sky blue?'}
for part in ollama.chat(model='llama2', messages=[message], stream=True):
  print(part['message']['content'], end='', flush=True)
```

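Because the streamed parts are plain dicts yielded by an ordinary generator, they can be collected with standard Python. A minimal sketch, using a stand-in generator in place of a live `ollama.chat(..., stream=True)` call so no server is required (the stand-in and its tokens are illustrative):

```python
# Stand-in for the generator returned by ollama.chat(..., stream=True);
# each part mirrors the {'message': {'content': ...}} shape used above.
def fake_stream():
    for token in ['The', ' sky', ' is', ' blue.']:
        yield {'message': {'role': 'assistant', 'content': token}}

# Accumulate the streamed chunks into the full reply text.
text = ''.join(part['message']['content'] for part in fake_stream())
print(text)
```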
### Pull

```python
ollama.pull('llama2')
```

### Push

```python
ollama.push('user/llama2')
```

### Embeddings

```python
ollama.embeddings(model='llama2', prompt='The sky is blue because of rayleigh scattering')
```

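Embedding vectors are sequences of floats, so two prompts can be compared with only the standard library. A minimal cosine-similarity sketch (the `cosine_similarity` helper is illustrative, not part of the library):

```python
import math

def cosine_similarity(a, b):
    # Cosine similarity between two equal-length embedding vectors.
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

# Identical directions score 1.0; orthogonal vectors score 0.0.
print(cosine_similarity([1.0, 0.0], [1.0, 0.0]))
```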
## Custom client

A custom client can be created with the following fields:

- `host`: The Ollama host to connect to
- `timeout`: The timeout for requests

```python
from ollama import Client

# http://localhost:11434 is Ollama's default host
client = Client(host='http://localhost:11434')
message = {'role': 'user', 'content': 'Why is the sky blue?'}
for part in client.chat(model='llama2', messages=[message], stream=True):
  print(part['message']['content'], end='', flush=True)
```