On the *Ingress* tab, enable the ingress and set it to `Accept traffic from anywhere`.

Now that the Container App is running, you still need to activate the llama 3.2 model. Open the Azure Shell: https://portal.azure.com/#cloudshell/

You need to connect to the container to run commands inside it, for example to modify the applications running in the container or to check logs. To connect, use the `az containerapp exec` command, replacing the Container App and Resource Group names with the ones you used; the `--command` must be `/bin/sh` so you get a shell inside the container:

`az containerapp exec --name microhack-aiapp --resource-group MicroHack-AppServiceToContainerApp --command "/bin/sh"`

You should see *INFO: Connection to the container 'microhack-aiapp'...*

After that, you can load and run the llama 3.2 model by running this command inside the container: `ollama run llama3.2`

After the model is loaded, you can already chat with it:
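If you prefer to script the conversation instead of using the interactive prompt, Ollama also exposes an HTTP API (default port 11434), so with the ingress enabled you can query the model over HTTPS. A minimal sketch, assuming the ingress target port is Ollama's default and using a hypothetical app URL that you would replace with your Container App's actual FQDN:

```python
import json
import urllib.request

# Hypothetical URL: replace with your Container App's FQDN.
# Assumes the ingress forwards to Ollama's default port 11434.
APP_URL = "https://microhack-aiapp.example.azurecontainerapps.io"

def build_payload(prompt: str) -> dict:
    """Build the request body for Ollama's /api/generate endpoint.

    stream=False asks for a single JSON response instead of a
    stream of partial tokens.
    """
    return {"model": "llama3.2", "prompt": prompt, "stream": False}

def ask_llama(prompt: str) -> str:
    """Send one prompt to the model and return its reply text."""
    req = urllib.request.Request(
        APP_URL + "/api/generate",
        data=json.dumps(build_payload(prompt)).encode("utf-8"),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)["response"]

if __name__ == "__main__":
    print(ask_llama("Why is the sky blue?"))
```

This uses only the standard library; the same request can of course be sent with `curl` or any HTTP client.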
