
Commit 275fa73

feat: support deployment using FastAPI (#37)

Signed-off-by: Ruichen Bao <ruichen.bao@zju.edu.cn>
1 parent c5d0329 · commit 275fa73

5 files changed: +81 −0 lines changed


README.md

Lines changed: 19 additions & 0 deletions

@@ -209,6 +209,25 @@ More help information

```shell
deepsearcher --help
```

### Deployment

#### Configure modules

You can configure all arguments by modifying [config.py](./config.py) to set up your system with the default modules.
For example, set your `OPENAI_API_KEY` as the value of `llm_api_key`.

#### Start service

The main script runs a FastAPI service at the default address `localhost:8000`.

```shell
$ python main.py
```

#### Access via browser

Open http://localhost:8000/docs in a browser to access the web service.
Click the "Try it out" button to fill in the parameters and interact with the API directly.

---
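The `/query/` endpoint takes its inputs as URL query parameters. A minimal sketch of constructing such a request URL with Python's standard library (the query text is a made-up example, and the service must already be running for the URL to respond):

```python
from urllib.parse import urlencode

# Hypothetical example query against a locally running service.
base = "http://localhost:8000/query/"
params = {"original_query": "What is DeepSearcher?", "max_iter": 3}

url = f"{base}?{urlencode(params)}"
print(url)  # http://localhost:8000/query/?original_query=What+is+DeepSearcher%3F&max_iter=3
```

The same URL can be produced interactively through the "Try it out" form on the `/docs` page.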

config.py

Lines changed: 6 additions & 0 deletions

@@ -0,0 +1,6 @@

```python
from pydantic_settings import BaseSettings

class Settings(BaseSettings):
    llm_provider: str = "OpenAI"
    llm_model: str = "gpt-4o-mini"
    llm_api_key: str = "sk-xxxx"
```
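Because `Settings` subclasses `BaseSettings`, each field can also be supplied through an environment variable of the same name (matched case-insensitively) instead of editing the file. A dependency-free sketch of that lookup order follows; it is an illustration of the behavior, not the pydantic-settings implementation:

```python
import os

# Declared defaults, mirroring the Settings fields above.
DEFAULTS = {
    "llm_provider": "OpenAI",
    "llm_model": "gpt-4o-mini",
    "llm_api_key": "sk-xxxx",
}

def load_settings() -> dict:
    # Environment variables win over declared defaults;
    # names are matched case-insensitively.
    env = {k.lower(): v for k, v in os.environ.items()}
    return {field: env.get(field, default) for field, default in DEFAULTS.items()}

os.environ["LLM_API_KEY"] = "sk-from-env"  # e.g. `export LLM_API_KEY=...` in the shell
print(load_settings()["llm_api_key"])      # prints "sk-from-env"
```

This is why the README suggests putting your `OPENAI_API_KEY` into `llm_api_key`: it can be set once in the environment rather than committed to the file.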

main.py

Lines changed: 50 additions & 0 deletions

@@ -0,0 +1,50 @@

```python
from fastapi import FastAPI, HTTPException
from deepsearcher.configuration import Configuration, init_config
from deepsearcher.offline_loading import load_from_local_files, load_from_website
from deepsearcher.online_query import query
import uvicorn
from config import Settings

app = FastAPI()

settings = Settings()

config = Configuration()

config.set_provider_config(
    "llm",
    settings.llm_provider,
    {
        "model": settings.llm_model,
        "api_key": settings.llm_api_key
    }
)

init_config(config)

@app.post("/load-files/")
def load_files(paths: list[str], collection_name: str = None, collection_description: str = None):
    try:
        load_from_local_files(paths_or_directory=paths, collection_name=collection_name, collection_description=collection_description)
        return {"message": "Files loaded successfully."}
    except Exception as e:
        raise HTTPException(status_code=500, detail=str(e))

@app.post("/load-website/")
def load_website(urls: str, collection_name: str = None, collection_description: str = None):
    try:
        load_from_website(urls=urls, collection_name=collection_name, collection_description=collection_description)
        return {"message": "Website loaded successfully."}
    except Exception as e:
        raise HTTPException(status_code=500, detail=str(e))

@app.get("/query/")
def perform_query(original_query: str, max_iter: int = 3):
    try:
        result = query(original_query, max_iter)
        return {"result": result}
    except Exception as e:
        raise HTTPException(status_code=500, detail=str(e))

if __name__ == "__main__":
    uvicorn.run(app, host="0.0.0.0", port=8000)
```
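In `load_files`, FastAPI treats the `paths: list[str]` parameter as the JSON request body, while the scalar parameters become URL query parameters. A sketch of assembling such a `/load-files/` request with the standard library (the paths and collection name are made-up examples, and the block only builds the request rather than sending it):

```python
import json
from urllib.parse import urlencode

# Hypothetical inputs for illustration; the service itself is not contacted here.
base = "http://localhost:8000/load-files/"
query = urlencode({"collection_name": "my_docs"})   # scalar -> query string
body = json.dumps(["./data/report.pdf", "./data/notes.md"])  # list[str] -> JSON body

url = f"{base}?{query}"
print(url)   # http://localhost:8000/load-files/?collection_name=my_docs
print(body)  # ["./data/report.pdf", "./data/notes.md"]
```

An HTTP client (curl, requests, or the `/docs` page) would POST `body` to `url` with a `Content-Type: application/json` header.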

requirements.txt

Lines changed: 3 additions & 0 deletions

```diff
@@ -7,3 +7,6 @@ openai
 numpy
 tqdm
 termcolor
+fastapi
+uvicorn
+pydantic-settings
```

setup.py

Lines changed: 3 additions & 0 deletions

```diff
@@ -15,6 +15,9 @@
         'numpy',
         'tqdm',
         'termcolor',
+        'fastapi',
+        'uvicorn',
+        'pydantic-settings'
     ],
     entry_points={
         'console_scripts': ['deepsearcher=deepsearcher.cli:main'],
```
