Commit 47d4c00

Merge pull request #2443 from chrismoroney/cmoroney-run-llm-chatbot-with-pytorch-using-kleidiai-reviewed-10-2025
Pytorch-llama - solution to localhost not connecting
2 parents 80b9f8e + 8aa2626

File tree

1 file changed: +12 -0 lines changed


content/learning-paths/servers-and-cloud-computing/pytorch-llama/pytorch-llama-frontend.md

Lines changed: 12 additions & 0 deletions
@@ -74,3 +74,15 @@ Collecting usage statistics. To deactivate, set browser.gatherUsageStats to false
 Open the local URL from the link above in a browser and you should see the chatbot running:
 
 ![Chatbot](images/chatbot.png)
+
+{{% notice Note %}}
+If you are running a server in the cloud, the local URL may not connect when starting the frontend server. If this happens, stop the frontend server and reconnect to your instance using port forwarding (see code below). After reconnecting, activate the `venv` and start the Streamlit frontend server.
+
+```sh
+# Replace with your .pem file and machine's public IP
+ssh -i /path/to/your/key.pem -L 8501:localhost:8501 ubuntu@<your-ec2-public-ip>
+source torch_env/bin/activate
+cd torchchat
+streamlit run browser/browser.py
+```
+{{% /notice %}}
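
As background on why the fix works: the `-L 8501:localhost:8501` option tells `ssh` to forward port 8501 on the local machine to port 8501 on the instance, which is Streamlit's default port, so the `localhost` URL printed by the frontend opens in the local browser. A minimal sketch for checking that the tunnel is working, assuming `curl` is available on the local machine and Streamlit is on its default port (neither is part of the commit):

```sh
# Run on the LOCAL machine, in a second terminal, while both the
# ssh -L tunnel and the Streamlit server are running.
# An HTTP response here means traffic is reaching Streamlit through
# the tunnel; no response suggests the tunnel or server is down.
curl -I http://localhost:8501
```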
