Commit 5e9d142

Add support for local OpenAI-compatible server, fix local hot reloading, add docs on local dev (#1148)

* Add support for local server and local dev docs
* Add test

1 parent 2a0e9d1

File tree

5 files changed: +90 −0 lines changed

README.md

Lines changed: 2 additions & 0 deletions

```diff
@@ -276,6 +276,8 @@ You can only run locally **after** having successfully run the `azd up` command.
 2. Change dir to `app`
 3. Run `./start.ps1` or `./start.sh` or run the "VS Code Task: Start App" to start the project locally.
 
+See more tips in [the local development guide](docs/local.md).
+
 ## Using the app
 
 * In Azure: navigate to the Azure WebApp deployed by azd. The URL is printed out when azd completes (as "Endpoint"), or you can find it in the Azure portal.
```

app/backend/app.py

Lines changed: 2 additions & 0 deletions

```diff
@@ -299,6 +299,8 @@ async def setup_clients():
             azure_endpoint=f"https://{AZURE_OPENAI_SERVICE}.openai.azure.com",
             azure_ad_token_provider=token_provider,
         )
+    elif OPENAI_HOST == "local":
+        openai_client = AsyncOpenAI(base_url=os.environ["OPENAI_BASE_URL"], api_key="no-key-required")
     else:
         openai_client = AsyncOpenAI(
             api_key=OPENAI_API_KEY,
```
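For readers skimming the diff, the host-selection logic this change adds can be modeled as a standalone function. This is an illustrative sketch, not the repository's code; `pick_openai_client_kwargs` is a hypothetical helper name:

```python
import os

def pick_openai_client_kwargs(openai_host: str) -> dict:
    """Hypothetical helper mirroring the OPENAI_HOST branch added above."""
    if openai_host == "local":
        # Local OpenAI-compatible servers (e.g. llamafile) don't check the key,
        # but the openai client library requires some non-empty value.
        return {
            "base_url": os.environ["OPENAI_BASE_URL"],
            "api_key": "no-key-required",
        }
    # Otherwise fall back to the hosted API with a real key.
    return {"api_key": os.environ.get("OPENAI_API_KEY", "")}

os.environ["OPENAI_BASE_URL"] = "http://localhost:8080/v1"
print(pick_openai_client_kwargs("local"))
# → {'base_url': 'http://localhost:8080/v1', 'api_key': 'no-key-required'}
```

The real code passes these keyword arguments straight to `AsyncOpenAI`, which is why the `"local"` branch needs a placeholder `api_key` even though the local server ignores it.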

app/frontend/vite.config.ts

Lines changed: 1 addition & 0 deletions

```diff
@@ -27,6 +27,7 @@ export default defineConfig({
         proxy: {
             "/content/": "http://localhost:50505",
             "/auth_setup": "http://localhost:50505",
+            "/.auth/me": "http://localhost:50505",
             "/ask": "http://localhost:50505",
             "/chat": "http://localhost:50505",
             "/config": "http://localhost:50505"
```

docs/localdev.md

Lines changed: 67 additions & 0 deletions (new file)

# Local development of Chat App

You can only run locally **after** having successfully run the `azd up` command. If you haven't yet, follow the steps in [Azure deployment](../README.md#azure-deployment) above.

1. Run `azd auth login`
2. Change dir to `app`
3. Run `./start.ps1` or `./start.sh` or run the "VS Code Task: Start App" to start the project locally.

## Hot reloading frontend and backend files

When you run `./start.ps1` or `./start.sh`, the backend files will be watched and reloaded automatically, but the frontend files will not.

To enable hot reloading of frontend files, open a new terminal and navigate to the frontend directory:

```shell
cd app/frontend
```

Then run:

```shell
npm run dev
```

You should see:

```shell
> vite

  VITE v4.5.1  ready in 957 ms

  ➜  Local:   http://localhost:5173/
  ➜  Network: use --host to expose
  ➜  press h to show help
```

Navigate to the URL shown in the terminal (in this case, `http://localhost:5173/`). This local server will watch and reload frontend files. All backend requests will be routed to the Python server according to `vite.config.ts`.

Then, whenever you make changes to frontend files, the changes will be automatically reloaded, without any browser refresh needed.

## Using a local OpenAI-compatible API

You may want to save costs by developing against a local LLM server, such as [llamafile](https://github.com/Mozilla-Ocho/llamafile/). Note that a local LLM will generally be slower and less sophisticated.

Once you've got your local LLM running and serving an OpenAI-compatible endpoint, set these environment variables:

```shell
azd env set OPENAI_HOST local
azd env set OPENAI_BASE_URL <your local endpoint>
```

For example, to point at a local llamafile server running on its default port:

```shell
azd env set OPENAI_BASE_URL http://localhost:8080/v1
```

If you're running inside a dev container, use this local URL instead:

```shell
azd env set OPENAI_BASE_URL http://host.docker.internal:8080/v1
```
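Once `OPENAI_BASE_URL` points at a running server, a quick smoke test can confirm the endpoint speaks the OpenAI chat-completions protocol before starting the app. This is a minimal sketch, not part of the repository, assuming a llamafile-style server at `http://localhost:8080/v1`:

```python
import json
import urllib.request

def build_chat_request(base_url: str, prompt: str) -> urllib.request.Request:
    """Build an OpenAI-style /chat/completions request for a local server."""
    payload = {
        # Most local servers accept any model name, so this value is a placeholder.
        "model": "local-model",
        "messages": [{"role": "user", "content": prompt}],
    }
    return urllib.request.Request(
        url=base_url.rstrip("/") + "/chat/completions",
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
        method="POST",
    )

req = build_chat_request("http://localhost:8080/v1", "Say hello")
print(req.full_url)  # → http://localhost:8080/v1/chat/completions
# To actually send it (requires the local server to be running):
#   with urllib.request.urlopen(req) as resp:
#       print(json.load(resp)["choices"][0]["message"]["content"])
```

If the request returns a JSON body with a `choices` list, the server is compatible enough for the app to use.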

tests/test_app.py

Lines changed: 18 additions & 0 deletions

```diff
@@ -54,6 +54,24 @@ async def test_missing_env_vars():
         test_app.test_client()
 
 
+@pytest.mark.asyncio
+async def test_app_local_openai(monkeypatch):
+    with mock.patch.dict(os.environ, clear=True):
+        monkeypatch.setenv("AZURE_STORAGE_ACCOUNT", "test-storage-account")
+        monkeypatch.setenv("AZURE_STORAGE_CONTAINER", "test-storage-container")
+        monkeypatch.setenv("AZURE_SEARCH_INDEX", "test-search-index")
+        monkeypatch.setenv("AZURE_SEARCH_SERVICE", "test-search-service")
+        monkeypatch.setenv("AZURE_OPENAI_CHATGPT_MODEL", "gpt-35-turbo")
+        os.environ["OPENAI_HOST"] = "local"
+        os.environ["OPENAI_BASE_URL"] = "http://localhost:5000"
+
+        quart_app = app.create_app()
+
+        async with quart_app.test_app():
+            assert quart_app.config[app.CONFIG_OPENAI_CLIENT].api_key == "no-key-required"
+            assert quart_app.config[app.CONFIG_OPENAI_CLIENT].base_url == "http://localhost:5000"
+
+
 @pytest.mark.asyncio
 async def test_index(client):
     response = await client.get("/")
```
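The test above uses `mock.patch.dict(os.environ, clear=True)` so it sees only the variables it sets itself. The core pattern, sketched independently of the repository's fixtures (`PRE_EXISTING` is a made-up variable for illustration):

```python
import os
from unittest import mock

os.environ["PRE_EXISTING"] = "outside"  # made-up variable, for illustration
os.environ.pop("OPENAI_HOST", None)     # ensure a clean starting state

with mock.patch.dict(os.environ, clear=True):
    # Inside the context the environment starts empty...
    assert "PRE_EXISTING" not in os.environ
    # ...so the test controls exactly which variables the app sees.
    os.environ["OPENAI_HOST"] = "local"
    assert os.environ["OPENAI_HOST"] == "local"

# On exit, the original environment is restored untouched.
assert os.environ["PRE_EXISTING"] == "outside"
assert "OPENAI_HOST" not in os.environ
print("environment restored")
```

This isolation is why the test can assert on the client's `api_key` and `base_url` without interference from whatever `OPENAI_*` variables happen to be set on the developer's machine.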

0 commit comments