diff --git a/cloud-infrastructure/private-cloud-and-edge/compute-cloud-at-customer/local-llm/README.md b/cloud-infrastructure/private-cloud-and-edge/compute-cloud-at-customer/local-llm/README.md
index 4de990fb5..6a83f51eb 100644
--- a/cloud-infrastructure/private-cloud-and-edge/compute-cloud-at-customer/local-llm/README.md
+++ b/cloud-infrastructure/private-cloud-and-edge/compute-cloud-at-customer/local-llm/README.md
@@ -1,4 +1,4 @@
-*Last Update: 23 November 2024*
+*Last Update: 27 November 2024*
Local LLM Inferencing and Interaction
Using the Ollama Open Source Tool

@@ -65,8 +65,7 @@ Create a VM in a public subnet following these guidelines:
10. Ensure that the VM is accessible via `ssh`
11. Configure the proxy setup if required (described below)
12. Update your local host's `/etc/hosts` file to reflect your public IP address for `llm-host`
-13. Should you have a proxy'd network follow the instruction in the "Proxy Settings" section below prior to performing the next step
-14. Perform an OS update in `llm-host` before proceeding:
+13. Perform an OS update in `llm-host` before proceeding:
```
sudo dnf update
@@ -267,7 +266,7 @@ ollama help serve
Download and test your first LLM (you can watch `/mnt/llm-repo` being populated with data by running `ls -lR /mnt/llm-repo`):
-
+
Run some more tests from your client to test the APIs:
@@ -275,8 +274,8 @@ Run some more tests from your client to test the APIs:
$ curl http://llm-host:11434/api/tags
$ curl http://llm-host:11434/api/ps
$ curl -X POST http://llm-host:11434/api/generate -d '{
- "model": "llama3.2",
- "prompt":"Hello Llama3.2!",
+ "model": "mistral",
+ "prompt":"Hello Mistral!",
"stream": false
}'
```
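The same `/api/generate` call can also be made programmatically. A minimal sketch, assuming the Ollama server is reachable at `http://llm-host:11434` and the `mistral` model has already been pulled (both taken from the curl tests above); the helper names here are illustrative, not part of Ollama:

```python
import json
import urllib.request

def build_payload(model, prompt, stream=False):
    """Build the JSON body expected by Ollama's /api/generate endpoint."""
    return json.dumps({"model": model, "prompt": prompt, "stream": stream}).encode()

def generate(prompt, model="mistral", host="http://llm-host:11434"):
    """Send a non-streaming generate request and return the model's text reply."""
    req = urllib.request.Request(
        f"{host}/api/generate",
        data=build_payload(model, prompt),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        # With "stream": false, Ollama returns one JSON object whose
        # "response" field holds the full completion.
        return json.loads(resp.read())["response"]

# Example (requires a running server):
#   print(generate("Hello Mistral!"))
```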
@@ -294,8 +293,8 @@ Install any of the GUI clients mentioned previously and test the connectivity an
1. Create a Remote Models Provider
2. Name it appropriately
3. The Service Endpoint is `http://llm-host:11434`
-4. "Fetch Models" (that are already installed, in this case `llama3.2`)
-5. This step can be repeated as new models are added
+4. Click "Fetch Models" to list the models already installed (in this case `mistral`)
+5. Repeat this step as new models are added from the Ollama model repository
Example output as follows:
diff --git a/cloud-infrastructure/private-cloud-and-edge/compute-cloud-at-customer/local-llm/images/msty-example.png b/cloud-infrastructure/private-cloud-and-edge/compute-cloud-at-customer/local-llm/images/msty-example.png
index d83fafb24..9a09a2ecf 100644
Binary files a/cloud-infrastructure/private-cloud-and-edge/compute-cloud-at-customer/local-llm/images/msty-example.png and b/cloud-infrastructure/private-cloud-and-edge/compute-cloud-at-customer/local-llm/images/msty-example.png differ
diff --git a/cloud-infrastructure/private-cloud-and-edge/compute-cloud-at-customer/local-llm/images/ollama-curl-statements.png b/cloud-infrastructure/private-cloud-and-edge/compute-cloud-at-customer/local-llm/images/ollama-curl-statements.png
index 8811fe46c..49bc74cf1 100644
Binary files a/cloud-infrastructure/private-cloud-and-edge/compute-cloud-at-customer/local-llm/images/ollama-curl-statements.png and b/cloud-infrastructure/private-cloud-and-edge/compute-cloud-at-customer/local-llm/images/ollama-curl-statements.png differ
diff --git a/cloud-infrastructure/private-cloud-and-edge/compute-cloud-at-customer/local-llm/images/ollama-pull-and-test.png b/cloud-infrastructure/private-cloud-and-edge/compute-cloud-at-customer/local-llm/images/ollama-pull-and-test.png
index fd48f8eb4..b8a409712 100644
Binary files a/cloud-infrastructure/private-cloud-and-edge/compute-cloud-at-customer/local-llm/images/ollama-pull-and-test.png and b/cloud-infrastructure/private-cloud-and-edge/compute-cloud-at-customer/local-llm/images/ollama-pull-and-test.png differ
diff --git a/cloud-infrastructure/private-cloud-and-edge/compute-cloud-at-customer/local-llm/images/ollama-pull-model.png b/cloud-infrastructure/private-cloud-and-edge/compute-cloud-at-customer/local-llm/images/ollama-pull-model.png
index f667de2f1..8fe4bf223 100644
Binary files a/cloud-infrastructure/private-cloud-and-edge/compute-cloud-at-customer/local-llm/images/ollama-pull-model.png and b/cloud-infrastructure/private-cloud-and-edge/compute-cloud-at-customer/local-llm/images/ollama-pull-model.png differ