Commit 3a5fe31

Fix Ollama model env var in documentation (#3156)
Signed-off-by: Dina Suehiro Jones <dina.s.jones@intel.com>
1 parent bb6ecd9 commit 3a5fe31

File tree

6 files changed: +7 −7 lines changed

python/samples/getting_started/agents/ollama/README.md

Lines changed: 2 additions & 2 deletions

@@ -40,8 +40,8 @@ Set the following environment variables:
 - `OLLAMA_HOST`: The base URL for your Ollama server (optional, defaults to `http://localhost:11434`)
   - Example: `export OLLAMA_HOST="http://localhost:11434"`
-- `OLLAMA_CHAT_MODEL_ID`: The model name to use
-  - Example: `export OLLAMA_CHAT_MODEL_ID="qwen2.5:8b"`
+- `OLLAMA_MODEL_ID`: The model name to use
+  - Example: `export OLLAMA_MODEL_ID="qwen2.5:8b"`
   - Must be a model you have pulled with Ollama

 ### For OpenAI Client with Ollama (`ollama_with_openai_chat_client.py`)
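The README change above renames the model environment variable. As a hedged sketch of how a sample might resolve these two variables (the helper name, fallback behavior, and error message here are illustrative assumptions, not code from the repository):

```python
import os


def resolve_ollama_config(env=os.environ):
    """Illustrative sketch: read the Ollama settings documented above.

    OLLAMA_HOST is optional and defaults to the local server;
    OLLAMA_MODEL_ID (formerly OLLAMA_CHAT_MODEL_ID) is required.
    """
    host = env.get("OLLAMA_HOST", "http://localhost:11434")
    model_id = env.get("OLLAMA_MODEL_ID")
    if model_id is None:
        raise ValueError("OLLAMA_MODEL_ID must name a model you have pulled with Ollama")
    return host, model_id
```

A caller would typically pass the resulting host and model ID straight to whatever Ollama client the sample constructs.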

python/samples/getting_started/agents/ollama/ollama_agent_basic.py

Lines changed: 1 addition & 1 deletion

@@ -12,7 +12,7 @@
 Ensure to install Ollama and have a model running locally before running the sample
 Not all Models support function calling, to test function calling try llama3.2 or qwen3:4b
-Set the model to use via the OLLAMA_CHAT_MODEL_ID environment variable or modify the code below.
+Set the model to use via the OLLAMA_MODEL_ID environment variable or modify the code below.
 https://ollama.com/
 """

python/samples/getting_started/agents/ollama/ollama_agent_reasoning.py

Lines changed: 1 addition & 1 deletion

@@ -12,7 +12,7 @@
 Ensure to install Ollama and have a model running locally before running the sample
 Not all Models support reasoning, to test reasoning try qwen3:8b
-Set the model to use via the OLLAMA_CHAT_MODEL_ID environment variable or modify the code below.
+Set the model to use via the OLLAMA_MODEL_ID environment variable or modify the code below.
 https://ollama.com/
 """

python/samples/getting_started/agents/ollama/ollama_chat_client.py

Lines changed: 1 addition & 1 deletion

@@ -12,7 +12,7 @@
 Ensure to install Ollama and have a model running locally before running the sample.
 Not all Models support function calling, to test function calling try llama3.2
-Set the model to use via the OLLAMA_CHAT_MODEL_ID environment variable or modify the code below.
+Set the model to use via the OLLAMA_MODEL_ID environment variable or modify the code below.
 https://ollama.com/
 """

python/samples/getting_started/agents/ollama/ollama_chat_multimodal.py

Lines changed: 1 addition & 1 deletion

@@ -12,7 +12,7 @@
 Ensure to install Ollama and have a model running locally before running the sample
 Not all Models support multimodal input, to test multimodal input try gemma3:4b
-Set the model to use via the OLLAMA_CHAT_MODEL_ID environment variable or modify the code below.
+Set the model to use via the OLLAMA_MODEL_ID environment variable or modify the code below.
 https://ollama.com/
 """

python/samples/getting_started/chat_client/README.md

Lines changed: 1 addition & 1 deletion

@@ -35,6 +35,6 @@ Depending on which client you're using, set the appropriate environment variable
 **For Ollama client:**
 - `OLLAMA_HOST`: Your Ollama server URL (defaults to `http://localhost:11434` if not set)
-- `OLLAMA_CHAT_MODEL_ID`: The Ollama model to use for chat (e.g., `llama3.2`, `llama2`, `codellama`)
+- `OLLAMA_MODEL_ID`: The Ollama model to use for chat (e.g., `llama3.2`, `llama2`, `codellama`)

 > **Note**: For Ollama, ensure you have Ollama installed and running locally with at least one model downloaded. Visit [https://ollama.com/](https://ollama.com/) for installation instructions.
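Putting the renamed variable into practice, a minimal shell setup before running one of the chat-client samples might look like the following (the model name is an example from the docs above; install Ollama and pull a model first, e.g. `ollama pull llama3.2`):

```shell
# Hedged sketch of the environment the Ollama samples expect after this rename.
export OLLAMA_HOST="http://localhost:11434"   # optional; this is the default
export OLLAMA_MODEL_ID="llama3.2"             # formerly OLLAMA_CHAT_MODEL_ID
```

Existing scripts that still export `OLLAMA_CHAT_MODEL_ID` need to switch to `OLLAMA_MODEL_ID` to match the updated samples.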
