Commit e03b12b

doc: provide a workable guideline update for ollama user (bytedance#323)
1 parent 8823ffd commit e03b12b

File tree

1 file changed: 4 additions, 3 deletions


docs/configuration_guide.md

Lines changed: 4 additions & 3 deletions
````diff
@@ -61,12 +61,13 @@ BASIC_MODEL:
 ### How to use Ollama models?
 
 DeerFlow supports the integration of Ollama models. You can refer to [litellm Ollama](https://docs.litellm.ai/docs/providers/ollama). <br>
-The following is a configuration example of `conf.yaml` for using Ollama models:
+The following is a configuration example of `conf.yaml` for using Ollama models (you may need to run `ollama serve` first):
 
 ```yaml
 BASIC_MODEL:
-  model: "ollama/ollama-model-name"
-  base_url: "http://localhost:11434" # Local service address of Ollama, which can be started/viewed via `ollama serve`
+  model: "model-name" # A model name that supports the completions API (important), e.g. qwen3:8b, mistral-small3.1:24b, qwen2.5:3b
+  base_url: "http://localhost:11434/v1" # Local service address of Ollama, which can be started/viewed via `ollama serve`
+  api_key: "whatever" # Mandatory; any placeholder string works :-)
 ```
 
 ### How to use OpenRouter models?
````
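The key change above is appending `/v1` to the Ollama base URL. A minimal sketch of why that matters (a hypothetical helper, not DeerFlow or litellm code): OpenAI-compatible clients join endpoint paths such as `/chat/completions` onto the configured base URL, and Ollama serves its OpenAI-compatible API under the `/v1` prefix.

```python
# Hypothetical illustration, not DeerFlow code: OpenAI-compatible clients
# append endpoint paths to the configured base_url, so the Ollama base URL
# must include the /v1 prefix where its OpenAI-compatible API lives.
def completions_endpoint(base_url: str) -> str:
    """Join the chat-completions path onto a configured base URL."""
    return base_url.rstrip("/") + "/chat/completions"

# The old base_url would make the client call a route Ollama does not serve:
print(completions_endpoint("http://localhost:11434"))
# The corrected base_url reaches Ollama's OpenAI-compatible API:
print(completions_endpoint("http://localhost:11434/v1"))
```

This also explains the `api_key` line: the client library requires a key to be set, but a local Ollama server ignores its value, so any placeholder string satisfies the check.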
