
Commit c64881b

Merge pull request #67 from TamiTakamiya/TamiTakamiya/run-ansible-chatbot-stack-from-source
Run ansible-chatbot-stack from source
2 parents 7705df9 + 52d8aa7 commit c64881b

File tree

7 files changed (+182, -9 lines)


.gitignore

Lines changed: 2 additions & 1 deletion

```diff
@@ -180,4 +180,5 @@ responses_store.db
 trace_store.db
 /embeddings_model
 /vector_db
-/llama-stack
+/llama-stack
+/work
```

Makefile

Lines changed: 1 addition & 0 deletions

```diff
@@ -156,6 +156,7 @@ clean:
 	@echo "Cleaning up your local folders..."
 	rm -rf llama-stack/
 	rm -rf providers.d/
+	rm -rf work/
 	@echo "Removing ansible-chatbot-stack images..."
 	docker rmi -f $$(docker images -a -q --filter reference=ansible-chatbot-stack) || true
 	@echo "Removing ansible-chatbot-stack containers..."
```

README.md

Lines changed: 39 additions & 0 deletions

The following section is appended to the end of the README, after the existing `make shell` example:

## Appendix - Run from source (PyCharm)
1. Clone the [lightspeed-core/lightspeed-stack](https://github.com/lightspeed-core/lightspeed-stack) repository to your development environment.
2. In the ansible-chatbot-stack project root, create a `.env` file and define the following variables:
```commandline
PYTHONDONTWRITEBYTECODE=1
PYTHONUNBUFFERED=1
PYTHONCOERCECLOCALE=0
PYTHONUTF8=1
PYTHONIOENCODING=UTF-8
LANG=en_US.UTF-8
VLLM_URL=(VLLM URL Here)
VLLM_API_TOKEN=(VLLM API Token Here)
INFERENCE_MODEL=granite-3.3-8b-instruct

LIBRARY_CLIENT_CONFIG_PATH=./ansible-chatbot-run.yaml
SYSTEM_PROMPT_PATH=./ansible-chatbot-system-prompt.txt
EMBEDDINGS_MODEL=./embeddings_model
VECTOR_DB_DIR=./vector_db
PROVIDERS_DB_DIR=./work
EXTERNAL_PROVIDERS_DIR=./llama-stack/providers.d
```
3. Create a Python run configuration with the following values (a command-line equivalent of this run configuration is sketched below):
   - script/module: `script`
   - script path: `(lightspeed-stack project root)/src/lightspeed_stack.py`
   - arguments: `--config ./lightspeed-stack_local.yaml`
   - working directory: `(ansible-chatbot-stack project root)`
   - path to ".env" files: `(ansible-chatbot-stack project root)/.env`
4. Run the created configuration from the PyCharm main menu.

#### Note:
If you want to debug code in the `lightspeed-providers` project, you
can add it as a local package dependency with:
```commandline
uv add --editable (lightspeed-providers project root)
```
This updates the `pyproject.toml` and `uv.lock` files. These changes
are for debugging purposes only; avoid checking them in.
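
For readers who prefer not to use PyCharm, the sketch below reproduces the same run configuration from a script. It is illustrative only and not part of this change: it assumes `python-dotenv` is installed and uses a hypothetical `LIGHTSPEED_STACK_ROOT` environment variable to locate the lightspeed-stack clone; the script path and `--config` argument mirror the run configuration described above.

```python
# Illustrative sketch only (not part of this repository): mirror the PyCharm
# run configuration from the command line. Assumes python-dotenv is installed
# and that LIGHTSPEED_STACK_ROOT (hypothetical) points at the lightspeed-stack clone.
import os
import subprocess
import sys
from pathlib import Path

from dotenv import load_dotenv

chatbot_root = Path(".")                                # ansible-chatbot-stack project root
stack_root = Path(os.environ["LIGHTSPEED_STACK_ROOT"])  # lightspeed-stack project root

# Load the .env file described in step 2, then launch lightspeed_stack.py with
# the same arguments and working directory as the PyCharm run configuration.
load_dotenv(chatbot_root / ".env")
subprocess.run(
    [
        sys.executable,
        str(stack_root / "src" / "lightspeed_stack.py"),
        "--config", "./lightspeed-stack_local.yaml",
    ],
    cwd=chatbot_root,
    check=True,
)
```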

ansible-chatbot-run.yaml

Lines changed: 8 additions & 8 deletions

```diff
@@ -28,7 +28,7 @@ providers:
       kvstore:
         type: sqlite
         namespace: null
-        db_path: /.llama/data/distributions/ansible-chatbot/aap_faiss_store.db
+        db_path: ${env.VECTOR_DB_DIR:/.llama/data/distributions/ansible-chatbot}/aap_faiss_store.db
   safety:
   - provider_id: llama-guard
     provider_type: inline::llama-guard
@@ -41,11 +41,11 @@ providers:
       persistence_store:
         type: sqlite
         namespace: null
-        db_path: /.llama/data/distributions/ansible-chatbot/agents_store.db
+        db_path: ${env.PROVIDERS_DB_DIR:/.llama/data/distributions/ansible-chatbot}/agents_store.db
       responses_store:
         type: sqlite
         namespace: null
-        db_path: /.llama/data/distributions/ansible-chatbot/responses_store.db
+        db_path: ${env.PROVIDERS_DB_DIR:/.llama/data/distributions/ansible-chatbot}/responses_store.db
       tools_filter:
         enabled: true
         model_id: ${env.INFERENCE_MODEL_FILTER:}
@@ -56,14 +56,14 @@ providers:
       kvstore:
         type: sqlite
         namespace: null
-        db_path: /.llama/data/distributions/ansible-chatbot/localfs_datasetio.db
+        db_path: ${env.PROVIDERS_DB_DIR:/.llama/data/distributions/ansible-chatbot}/localfs_datasetio.db
   telemetry:
   - provider_id: meta-reference
     provider_type: inline::meta-reference
     config:
       service_name: ${env.OTEL_SERVICE_NAME:}
       sinks: ${env.TELEMETRY_SINKS:console,sqlite}
-      sqlite_db_path: /.llama/data/distributions/ansible-chatbot/trace_store.db
+      sqlite_db_path: ${env.PROVIDERS_DB_DIR:/.llama/data/distributions/ansible-chatbot}/trace_store.db
   tool_runtime:
   - provider_id: rag-runtime
     provider_type: inline::rag-runtime
@@ -79,14 +79,14 @@ models:
   provider_model_id: null
 - metadata:
     embedding_dimension: 768
-  model_id: /.llama/data/distributions/ansible-chatbot/embeddings_model
+  model_id: ${env.EMBEDDINGS_MODEL:/.llama/data/distributions/ansible-chatbot/embeddings_model}
   provider_id: inline_sentence-transformer
   model_type: embedding
 shields: []
 vector_dbs:
 - metadata: {}
   vector_db_id: "aap-product-docs-2_5"
-  embedding_model: /.llama/data/distributions/ansible-chatbot/embeddings_model
+  embedding_model: ${env.EMBEDDINGS_MODEL:/.llama/data/distributions/ansible-chatbot/embeddings_model}
   embedding_dimension: 768
   provider_id: "aap_faiss"
 datasets: []
@@ -103,4 +103,4 @@ server:
   tls_cafile: null
   auth: null
   disable_ipv6: false
-external_providers_dir: /.llama/providers.d
+external_providers_dir: ${env.EXTERNAL_PROVIDERS_DIR:/.llama/providers.d}
```
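
The `${env.VAR:default}` placeholders above keep the container defaults in place while letting the new `.env` variables override them when running from source. The snippet below is a minimal sketch of that substitution pattern, not the actual implementation; llama-stack performs the real resolution when it loads `ansible-chatbot-run.yaml`.

```python
# Minimal sketch of the ${env.VAR:default} substitution pattern used above.
# The real resolution happens inside llama-stack when the run config is loaded;
# this only illustrates how PROVIDERS_DB_DIR etc. override the baked-in paths.
import os
import re

_ENV_REF = re.compile(r"\$\{env\.([A-Za-z0-9_]+):([^}]*)\}")

def resolve(value: str) -> str:
    """Replace ${env.NAME:default} with os.environ[NAME], falling back to default."""
    return _ENV_REF.sub(lambda m: os.environ.get(m.group(1), m.group(2)), value)

os.environ["PROVIDERS_DB_DIR"] = "./work"  # as set in the .env example above
print(resolve("${env.PROVIDERS_DB_DIR:/.llama/data/distributions/ansible-chatbot}/agents_store.db"))
# ./work/agents_store.db
print(resolve("${env.EXTERNAL_PROVIDERS_DIR:/.llama/providers.d}"))
# /.llama/providers.d  (falls back to the default when the variable is unset)
```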

lightspeed-stack_local.yaml

Lines changed: 16 additions & 0 deletions (new file)

```yaml
name: Ansible Lightspeed Intelligent Assistant
service:
  host: 0.0.0.0
  port: 8080
  auth_enabled: false
  workers: 1
  color_log: true
  access_log: true
llama_stack:
  use_as_library_client: true
  library_client_config_path: ./ansible-chatbot-run.yaml
user_data_collection:
  feedback_disabled: true
  transcripts_disabled: true
customization:
  system_prompt_path: ./ansible-chatbot-system-prompt.txt
```
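
A quick way to sanity-check this new file before wiring up the run configuration is to load it and verify the paths it references. The sketch below is illustrative only: it assumes PyYAML is available and that it is run from the ansible-chatbot-stack project root; lightspeed-stack performs its own validation on startup.

```python
# Illustrative sanity check for lightspeed-stack_local.yaml (assumes PyYAML is
# installed and the script runs from the ansible-chatbot-stack project root).
from pathlib import Path

import yaml

config = yaml.safe_load(Path("lightspeed-stack_local.yaml").read_text())

# Library-client mode: lightspeed-stack embeds llama-stack in-process and reads
# the run config referenced below instead of calling a remote llama-stack server.
assert config["llama_stack"]["use_as_library_client"] is True

for label, path in (
    ("library_client_config_path", config["llama_stack"]["library_client_config_path"]),
    ("system_prompt_path", config["customization"]["system_prompt_path"]),
):
    print(f"{label}: {path} exists={Path(path).exists()}")
```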

pyproject.toml

Lines changed: 6 additions & 0 deletions

```diff
@@ -16,3 +16,9 @@ dependencies = [
     "sentence-transformers>=5.0.0",
     "sqlalchemy~=2.0.41",
 ]
+
+[dependency-groups]
+dev = [
+    "cachetools>=6.1.0",
+    "kubernetes>=33.1.0",
+]
```

uv.lock

Lines changed: 110 additions & 0 deletions
Some generated files are not rendered by default.
