# Local development of Chat App

After deploying the app to Azure, you may want to continue development locally. This guide explains how to run the app locally, including hot reloading and debugging.

* [Running development server from the command line](#running-development-server-from-the-command-line)
* [Hot reloading frontend and backend files](#hot-reloading-frontend-and-backend-files)
* [Using VS Code "Run and Debug"](#using-vs-code-run-and-debug)
* [Using a local OpenAI-compatible API](#using-a-local-openai-compatible-api)
  * [Using Ollama server](#using-ollama-server)
  * [Using llamafile server](#using-llamafile-server)

## Running development server from the command line

You can only run locally **after** having successfully run the `azd up` command. If you haven't yet, follow the steps in [Azure deployment](../README.md#azure-deployment) first.

1. Run `azd auth login`

## Hot reloading frontend and backend files

Navigate to the URL shown in the terminal (in this case, `http://localhost:5173/`).

Then, whenever you make changes to frontend files, the changes will be automatically reloaded, without any browser refresh needed.
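
Under the hood, this hot reloading comes from Vite's dev server. A minimal sketch of a Vite config that serves the frontend with hot module replacement and proxies API calls to the local backend might look like this (hypothetical: the `/chat` route is an assumption, and this repo's actual `vite.config.ts` may differ):

```ts
import { defineConfig } from "vite";

// Hypothetical sketch of a Vite dev-server setup: serve the frontend on
// port 5173 with hot module replacement, and proxy API requests to the
// Python backend so both can run side by side during development.
export default defineConfig({
  server: {
    port: 5173,
    proxy: {
      "/chat": "http://localhost:50505", // assumed backend route and port
    },
  },
});
```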

## Using VS Code "Run and Debug"

This project includes configurations defined in `.vscode/launch.json` that allow you to run and debug the app directly from VS Code:

* "Backend (Python)": Starts the Python backend server, defaulting to port 50505.
* "Frontend": Starts the frontend server using Vite, typically at port 5173.
* "Frontend & Backend": A compound configuration that starts both the frontend and backend servers.

When you run these configurations, you can set breakpoints in your code and debug as you would in a normal VS Code debugging session.
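
For reference, a compound configuration of that shape is typically wired up like the following sketch (illustrative only; the program paths and debugger types here are assumptions, so check the actual `.vscode/launch.json` for the real values):

```jsonc
{
  "version": "0.2.0",
  "configurations": [
    {
      // Hypothetical backend entry; the real config may use a different
      // debugger type or entry point.
      "name": "Backend (Python)",
      "type": "debugpy",
      "request": "launch",
      "program": "${workspaceFolder}/src/app.py",
      "console": "integratedTerminal"
    },
    {
      // Hypothetical frontend entry that runs the Vite dev server.
      "name": "Frontend",
      "type": "node-terminal",
      "request": "launch",
      "command": "npm run dev",
      "cwd": "${workspaceFolder}/src/frontend"
    }
  ],
  "compounds": [
    {
      // Starts both of the above at once.
      "name": "Frontend & Backend",
      "configurations": ["Backend (Python)", "Frontend"]
    }
  ]
}
```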

## Using a local OpenAI-compatible API

You may want to save costs by developing against a local LLM server, such as Ollama or llamafile.
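
For example, Ollama exposes an OpenAI-compatible endpoint at `http://localhost:11434/v1`, so any code built on the `openai` Python package can be pointed at it. A minimal sketch (the model name `llama3.1` is just an example of a model you have pulled locally):

```python
import openai

# Point the standard OpenAI client at a local Ollama server.
# Ollama's OpenAI-compatible endpoint lives under /v1; the API key is
# unused by Ollama, but the client requires a non-empty value.
client = openai.OpenAI(
    base_url="http://localhost:11434/v1",
    api_key="no-key-required",
)

response = client.chat.completions.create(
    model="llama3.1",  # example: any model fetched with `ollama pull`
    messages=[{"role": "user", "content": "Say hello from a local LLM!"}],
)
print(response.choices[0].message.content)
```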