Merged
Changes from 3 commits
7 changes: 6 additions & 1 deletion docs/ai/quickstarts/quickstart-local-ai.md
@@ -28,7 +28,12 @@ Complete the following steps to configure and run a local AI Model on your device:
ollama
```

If Ollama is available, it displays a list of available commands.
1. Start Ollama:

```bash
ollama serve
```

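To confirm the server started successfully, you can query its local HTTP endpoint in a second terminal (port 11434 is Ollama's default; adjust if you have configured a different one):

```shell
# Query the local Ollama server's root endpoint.
# A running server typically responds with the text "Ollama is running".
curl http://localhost:11434/
```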
1. Pull the `phi3:mini` model from the Ollama registry and wait for it to download:

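The download step above uses Ollama's `pull` command; a likely invocation is sketched below (exact progress output varies by Ollama version):

```shell
# Download the phi3:mini model from the Ollama registry.
# Layer download progress is displayed until the pull completes.
ollama pull phi3:mini
```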