diff --git a/03-Azure/01-01-App Innovation/01_AppServicetoContainerApps/walkthrough/challenge-5/solution.md b/03-Azure/01-01-App Innovation/01_AppServicetoContainerApps/walkthrough/challenge-5/solution.md
index 42a9964cf..dfd12d0de 100644
--- a/03-Azure/01-01-App Innovation/01_AppServicetoContainerApps/walkthrough/challenge-5/solution.md
+++ b/03-Azure/01-01-App Innovation/01_AppServicetoContainerApps/walkthrough/challenge-5/solution.md
@@ -30,14 +30,13 @@ On the *Ingress* tab, enable the ingress and set it to `Accept traffic from anyw
 Now that the Container App is running, you still need to activate the llama 3.2 model. Open the Azure Shell: https://portal.azure.com/#cloudshell/
 
-You need to connect to the container to run commands inside the container itself. You can use that to modify the applications running in the container or to check logs. To connect to the container, you can use `az containerapp exec` command, rplace the name of the Container App or Resource Group with the ones you used, the command must be `/bin/sh` to run commands in the container:
+You need to connect to the container to run commands inside the container itself. You can use this to modify the applications running in the container or to check logs.
+To connect to the container, use the `az containerapp exec` command, replacing the Container App and Resource Group names with the ones you used; the command must be `/bin/sh` so you can run commands in the container:
 
-`az containerapp exec --name microhack-aiapp --resource-group MicroHack-AppServiceToContainerApp --command "/bin/sh"
-ollama run llama3.2`
+`az containerapp exec --name microhack-aiapp --resource-group MicroHack-AppServiceToContainerApp --command "/bin/sh"`
 
 You should see *INFO: Connection to the container 'microhack-aiapp'...*
 
-After that, you can load and run the llama 3.2 model by running this command: `ollama run llama3.2`
+After that, you can load and run the llama 3.2 model by running this command inside the container: `ollama run llama3.2`
 
 After the model is loaded, you can already chat with it:
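+
+As a quick reference, the two steps above can be sketched together (using the same resource names as above; note that `ollama run` is typed at the shell prompt that opens *inside* the container, not in the Cloud Shell itself):
+
+```sh
+# From the Azure Cloud Shell: open an interactive shell inside the container
+az containerapp exec \
+  --name microhack-aiapp \
+  --resource-group MicroHack-AppServiceToContainerApp \
+  --command "/bin/sh"
+
+# Then, at the container's prompt: download and start the llama 3.2 model
+ollama run llama3.2
+```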