diff --git a/docs/ai/quickstarts/quickstart-local-ai.md b/docs/ai/quickstarts/quickstart-local-ai.md
index 5f16424ac1f7e..41abf32ca9c0c 100644
--- a/docs/ai/quickstarts/quickstart-local-ai.md
+++ b/docs/ai/quickstarts/quickstart-local-ai.md
@@ -28,6 +28,15 @@ Complete the following steps to configure and run a local AI Model on your devic
 
     ollama
     ```
 
+    If Ollama is available, it displays a list of available commands.
+
+1. Start Ollama:
+
+    ```bash
+    ollama serve
+    ```
+
+    When Ollama starts successfully, the terminal displays server log output.
 
 1. Pull the `phi3:mini` model from the Ollama registry and wait for it to download:
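
The availability check that this diff adds ("If Ollama is available, it displays a list of available commands") can also be done in a script. The following is a minimal sketch assuming a POSIX shell; only the `ollama` command name comes from the quickstart, the rest is illustrative:

```shell
# Probe for the ollama CLI before attempting `ollama serve`.
# `command -v` exits 0 only when the name resolves on PATH.
if command -v ollama >/dev/null 2>&1; then
  echo "ollama is available"
else
  echo "ollama is not installed"
fi
```

`command -v` is used instead of `which` because it is specified by POSIX and works in plain `sh`.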