
Commit 8aa2626

solution to localhost not connecting
1 parent b962a17 commit 8aa2626

File tree

1 file changed: +12 −0 lines changed


content/learning-paths/servers-and-cloud-computing/pytorch-llama/pytorch-llama-frontend.md

Lines changed: 12 additions & 0 deletions
@@ -74,3 +74,15 @@ Collecting usage statistics. To deactivate, set browser.gatherUsageStats to false

 Open the local URL from the link above in a browser and you should see the chatbot running:

 ![Chatbot](images/chatbot.png)
+
+{{% notice Note %}}
+If you are running a server in the cloud, the local URL may not connect when starting the frontend server. If this happens, stop the frontend server and reconnect to your instance using port forwarding (see code below). After reconnecting, activate the `venv` and start the Streamlit frontend server.
+
+```sh
+# Replace with your .pem file and machine's public IP
+ssh -i /path/to/your/key.pem -L 8501:localhost:8501 ubuntu@<your-ec2-public-ip>
+source torch_env/bin/activate
+cd torchchat
+streamlit run browser/browser.py
+```
+{{% /notice %}}
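The fix hinges on SSH local port forwarding: the `-L 8501:localhost:8501` option binds port 8501 on your local machine and tunnels it to port 8501 on the remote instance, which is Streamlit's default serving port, so the "local URL" printed by Streamlit becomes reachable from your own browser. As an illustrative aside (not part of the diff), a small sketch parsing the three colon-separated fields of a `-L` spec makes their roles explicit:

```python
def parse_forward_spec(spec: str) -> tuple[int, str, int]:
    """Split an ssh -L spec 'local_port:remote_host:remote_port' into its
    three fields. Illustrative helper only, not part of the learning path."""
    local_port, remote_host, remote_port = spec.split(":")
    return int(local_port), remote_host, int(remote_port)

# The spec used in the note above: local port 8501 is tunneled to
# port 8501 on the remote host (as seen from the SSH server side).
print(parse_forward_spec("8501:localhost:8501"))  # (8501, 'localhost', 8501)
```

Note that `localhost` here is resolved on the remote machine, not your laptop: the tunnel terminates at whatever the EC2 instance itself calls `localhost`, which is exactly where `streamlit run` listens.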
