@@ -59,6 +59,8 @@ Docker compose is the easiest way, as you get one-step to:
 * ingest data into elasticsearch
 * run the app, which listens on http://localhost:4000
 
+**Double-check you have a `.env` file with all your variables set first!**
+
 ```bash
 docker compose up --build --force-recreate
 ```
@@ -69,9 +71,10 @@ and retry.
 ### Run locally
 
 If you want to run this example with Python and Node.js, you need to do a few
-things listed in the [Dockerfile](Dockerfile).
+things listed in the [Dockerfile](Dockerfile). The steps below use the same
+production mode as Docker, to avoid problems in debug mode.
 
-**Make sure you have a `.env` file with all your variables**
+**Double-check you have a `.env` file with all your variables set first!**
 
 #### Build the frontend
 
@@ -123,3 +126,18 @@ $ dotenv run -- flask run
  * Serving Flask app 'api/app.py'
  * Debug mode: off
 ```
+
+## Customizing the app
+
+### Indexing your own data
+
+The ingestion logic lives in [data/index_data.py](data/index_data.py). This
+is a simple script that uses Langchain to index data into Elasticsearch, using
+`RecursiveCharacterTextSplitter` to split the large JSON documents into
+passages. Modify this script to index your own data.
+
+See the [Langchain documentation][loader-docs] for more ways to load documents.
+
+
+---
+[loader-docs]: https://python.langchain.com/docs/how_to/#document-loaders
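For context on the new section: the splitting strategy that `RecursiveCharacterTextSplitter` applies can be sketched in plain Python. This is an illustrative sketch of the technique only, not the Langchain implementation, and the chunk size and sample text are arbitrary:

```python
# Sketch of recursive character splitting: try coarse separators first
# (paragraph breaks), then fall back to finer ones (newlines, spaces),
# and only hard-cut text as a last resort. Illustrative only.
def recursive_split(text, chunk_size=200, separators=("\n\n", "\n", " ", "")):
    sep, *rest = separators
    if len(text) <= chunk_size:
        return [text] if text else []
    if sep == "":
        # Last resort: hard cut every chunk_size characters.
        return [text[i:i + chunk_size] for i in range(0, len(text), chunk_size)]
    chunks = []
    for part in text.split(sep):
        if len(part) <= chunk_size:
            if part:
                chunks.append(part)
        else:
            # Part is still too large: retry with the next, finer separator.
            chunks.extend(recursive_split(part, chunk_size, tuple(rest)))
    return chunks

doc = ("A passage about search.\n\n" + "word " * 100).strip()
passages = recursive_split(doc, chunk_size=50)
assert all(len(p) <= 50 for p in passages)
```

Splitting on coarse separators first keeps semantically related text together in one passage wherever possible, which is why the Langchain splitter works well for indexing large documents as retrievable passages.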