Commit dfe48e3

Trying to rebase with main and commit the files again

Signed-off-by: Vineeth Kalluru <[email protected]>

# Conflicts:
#   industries/asset_lifecycle_management_agent/README.md
#   industries/asset_lifecycle_management_agent/configs/config-reasoning.yml

1 parent 1adeb6d · commit dfe48e3

File tree: 8 files changed (+1183 / −342 lines)

Lines changed: 108 additions & 32 deletions
@@ -1,46 +1,122 @@
-# macOS system files
+# Misc
+config_examples.yml
+env.sh
+frontend/
+prompts.md
+
+# Python
+__pycache__/
+*.py[cod]
+*$py.class
+*.so
+.Python
+*.egg
+*.egg-info/
+dist/
+build/
+*.whl
+pip-wheel-metadata/
 .DS_Store
-.DS_Store?
-._*
-.Spotlight-V100
-.Trashes
-ehthumbs.db
-Thumbs.db
 
-# Database and vector store files
-database/
-*.db
-*.sqlite3
+# Virtual environments
+.venv/
+venv/
+ENV/
+env/
+
+# IDEs and Editors
+.vscode/
+.idea/
+*.swp
+*.swo
+*~
+.DS_Store
+
+# Testing
+.pytest_cache/
+.coverage
+htmlcov/
+.tox/
+.hypothesis/
+
+# Jupyter Notebook
+.ipynb_checkpoints/
+*.ipynb_checkpoints/
 
-# Output and generated files
+# Output and Data Directories
 output_data/
-moment/
-readmes/
-*.html
-*.csv
-*.npy
+eval_output/
+example_eval_output/
+output/
+results/
+logs/
 
-# Python package metadata
-src/**/*.egg-info/
-*.egg-info/
+# Database files
+*.db
+*.sqlite
+*.sqlite3
+database/*.db
+database/*.sqlite
 
-# Environment files (if they contain secrets)
-env.sh
+# Vector store data (ChromaDB)
+database/
+chroma_db/
+vector_store/
+vanna_vector_store/
 
-# Model files (if large/binary)
+# Model files (large binary files)
 models/*.pkl
-models/*.joblib
-models/*.model
+models/*.h5
+models/*.pt
+models/*.pth
+models/*.ckpt
+*.pkl
+*.h5
+*.pt
+*.pth
+moment/
 
-# Logs
-*.log
-logs/
+# Data files (CSV, JSON, etc. - be selective)
+*.csv
+*.json
+!training_data.json
+!vanna_training_data.yaml
+!config*.json
+!config*.yaml
+!config*.yml
+!pyproject.toml
+!package.json
+
+# Frontend build artifacts
+frontend/node_modules/
+frontend/dist/
+frontend/build/
+frontend/.next/
+frontend/out/
+
+# Environment and secrets
+.env
+.env.local
+.env.*.local
+*.secret
+secrets/
+credentials/
 
 # Temporary files
 *.tmp
 *.temp
-.pytest_cache/
-__pycache__/
+*.log
+*.cache
+
+# OS specific
+Thumbs.db
+Desktop.ini
+
+# Experiment tracking
+mlruns/
+wandb/
 
-# dot env
-mydot.env
+# Documentation builds
+docs/_build/
+docs/.doctrees/
+site/

industries/asset_lifecycle_management_agent/README.md

Lines changed: 21 additions & 4 deletions
@@ -175,6 +175,21 @@ Now install the ALM workflow:
 uv pip install -e .
 ```
 
+#### Installation Options
+
+**Base Installation** (default - includes ChromaDB + SQLite):
+```bash
+uv pip install -e .
+```
+
+**Optional Database Support:**
+- PostgreSQL: `uv pip install -e ".[postgres]"`
+- MySQL: `uv pip install -e ".[mysql]"`
+- All databases: `uv pip install -e ".[all-databases]"`
+
+**Optional Vector Store:**
+- Elasticsearch: `uv pip install -e ".[elasticsearch]"`
+
 ### [Optional] Verify if all prerequisite packages are installed
 ```bash
 uv pip list | grep -E "nvidia-nat|nvidia-nat-ragaai|nvidia-nat-phoenix|vanna|chromadb|xgboost|pytest|torch|matplotlib"

@@ -463,7 +478,9 @@ def your_custom_utility(file_path: str, param: int = 100) -> str:
 4. **Consistent Interface**: All utilities return descriptive success messages
 5. **Documentation**: Use `utils.show_utilities()` to discover available functions
 
-### Setup Web Interface
+### Alternative: Generic NeMo-Agent-Toolkit UI
+
+If you prefer the generic NeMo Agent Toolkit UI instead of our custom interface:
 
 ```bash
 git clone https://github.com/NVIDIA/NeMo-Agent-Toolkit-UI.git

@@ -509,7 +526,7 @@ Retrieve and detect anomalies in sensor 4 measurements for engine number 78 in t
 
 **Workspace Utilities Demo**
 ```
-Retrieve ground truth RUL values and time in cycles from FD001 train dataset. Apply piecewise RUL transformation with MAXLIFE=100. Finally, Plot a line chart of the transformed values across time.
+Retrieve RUL values and time in cycles for engine unit 24 from FD001 train dataset. Use the piece wise RUL transformation code utility to perform piecewise RUL transformation on the ground truth RUL values with MAXLIFE=100. Finally, plot a comparison line chart with RUL values and its transformed values across time.
 ```
 
 *This example demonstrates how to discover and use workspace utilities directly. The system will show available utilities and then apply the RUL transformation using the pre-built, reliable utility functions.*

@@ -518,9 +535,9 @@ Retrieve ground truth RUL values and time in cycles from FD001 train dataset. Ap
 ```
 Perform the following steps:
 
-1.Retrieve the time in cycles, all sensor measurements, and ground truth RUL values for engine unit 24 from FD001 train dataset.
+1.Retrieve the time in cycles, all sensor measurements, and ground truth RUL values, partitioned by unit number, for engine unit 24 from FD001 train dataset.
 2.Use the retrieved data to predict the Remaining Useful Life (RUL).
-3.Use the piece wise RUL transformation code utility to apply piecewise RUL transformation only to the observed RUL column.
+3.Use the piece wise RUL transformation code utility to apply piecewise RUL transformation only to the observed RUL column with MAXLIFE of 100.
 4.Generate a plot that compares the transformed RUL values and the predicted RUL values across time.
 ```
 ![Prediction Example](imgs/test_prompt_3.png)
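The new installation extras pair with the database and vector-store keys this commit adds to `configs/config-reasoning.yml` (see the config diff below). As a rough sketch only — the key names and the `chromadb`/`sqlite` defaults come from that diff, while the `postgres`/`elasticsearch` values and the connection-string format are assumptions, not documented behavior — a non-default backend might be wired up like this:

```yaml
# Hypothetical sketch, not taken from the repo: swaps the documented
# chromadb/sqlite defaults for backends enabled by the optional extras.
functions:
  sql_retriever:
    _type: generate_sql_query_and_retrieve_tool
    llm_name: sql_llm
    embedding_name: vanna_embedder
    # Vector store configuration (chromadb is the documented default)
    vector_store_type: elasticsearch   # assumed value; mirrors the "elasticsearch" extra
    vector_store_path: "database"
    # Database configuration (sqlite is the documented default)
    db_type: postgres                  # assumed value; mirrors the "postgres" extra
    db_connection_string_or_path: "postgresql://user:password@localhost:5432/nasa_turbo"  # assumed URL format
    output_folder: "output_data"
    vanna_training_data_path: "vanna_training_data.yaml"
```

If you go this route, install the matching extras first (for example `uv pip install -e ".[postgres,elasticsearch]"`), since the base installation only ships the ChromaDB and SQLite support.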

industries/asset_lifecycle_management_agent/configs/config-reasoning.yml

Lines changed: 28 additions & 9 deletions
@@ -19,21 +19,37 @@ general:
 logging:
 console:
 _type: console
+<<<<<<< HEAD:industries/asset_lifecycle_management_agent/configs/config-reasoning.yml
 level: DEBUG
 # level: INFO
 # file:
 # _type: file
 # path: "alm.log"
 # level: DEBUG
+=======
+level: INFO
+file:
+_type: file
+path: "pdm.log"
+level: DEBUG
+>>>>>>> fbde275 (Trying to rebase with main and commit the files again):industries/predictive_maintenance_agent/configs/config-reasoning.yml
 tracing:
 phoenix:
 _type: phoenix
 endpoint: http://localhost:6006/v1/traces
+<<<<<<< HEAD:industries/asset_lifecycle_management_agent/configs/config-reasoning.yml
 project: alm-agent
 # catalyst:
 # _type: catalyst
 # project: "alm-agent"
 # dataset: "alm-agent"
+=======
+project: pdm-demo-day
+catalyst:
+_type: catalyst
+project: "pdm-demo-day"
+dataset: "pdm-demo-day"
+>>>>>>> fbde275 (Trying to rebase with main and commit the files again):industries/predictive_maintenance_agent/configs/config-reasoning.yml
 
 llms:
 # SQL query generation model

@@ -43,13 +59,11 @@ llms:
 
 # Data analysis and tool calling model
 analyst_llm:
-_type: nim
-model_name: "qwen/qwen2.5-coder-32b-instruct"
-# _type: openai
-# model_name: "gpt-4.1-mini"
+_type: openai
+model_name: "gpt-4.1-mini"
 
 # Python code generation model
-coding_llm:
+coding_llm:
 _type: nim
 model_name: "qwen/qwen2.5-coder-32b-instruct"
 

@@ -67,15 +81,20 @@ embedders:
 # Text embedding model for vector database operations
 vanna_embedder:
 _type: nim
-model_name: "nvidia/nv-embed-v1"
+model_name: "nvidia/llama-3.2-nv-embedqa-1b-v2"
 
 functions:
 sql_retriever:
 _type: generate_sql_query_and_retrieve_tool
 llm_name: sql_llm
 embedding_name: vanna_embedder
+# Vector store configuration
+vector_store_type: chromadb # Optional, chromadb is default
 vector_store_path: "database"
-db_path: "database/nasa_turbo.db"
+# Database configuration
+db_type: sqlite # Optional, sqlite is default
+db_connection_string_or_path: "database/nasa_turbo.db"
+# Output configuration
 output_folder: "output_data"
 vanna_training_data_path: "vanna_training_data.yaml"
 

@@ -129,8 +148,8 @@ functions:
 plot_line_chart,
 plot_comparison,
 anomaly_detection,
-plot_anomaly,
-code_generation_assistant
+plot_anomaly
+# code_generation_assistant
 ]
 parse_agent_response_max_retries: 2
 system_prompt: |
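Note that the first hunk above commits unresolved merge-conflict markers (`<<<<<<< HEAD`, `=======`, `>>>>>>> fbde275`), which leaves the committed YAML unparsable; the commit message's `# Conflicts:` note flags this same file. A minimal sketch of one possible resolution, keeping the HEAD (asset_lifecycle_management_agent) side of both conflicts — indentation and nesting are assumed here, since the diff view drops leading whitespace:

```yaml
# One possible resolution of the conflicted logging/tracing block (HEAD side kept).
# Indentation is assumed; the diff above shows these lines without leading whitespace.
logging:
  console:
    _type: console
    level: DEBUG
    # level: INFO
  # file:
  #   _type: file
  #   path: "alm.log"
  #   level: DEBUG
tracing:
  phoenix:
    _type: phoenix
    endpoint: http://localhost:6006/v1/traces
    project: alm-agent
  # catalyst:
  #   _type: catalyst
  #   project: "alm-agent"
  #   dataset: "alm-agent"
```

Keeping the incoming side instead would switch the file logger to `pdm.log` and re-enable the catalyst tracing entry with the `pdm-demo-day` project, as shown in the `=======`…`>>>>>>>` half of each conflict.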

industries/asset_lifecycle_management_agent/pyproject.toml

Lines changed: 31 additions & 0 deletions
@@ -11,6 +11,7 @@ dependencies = [
 "pydantic ~= 2.10.0, <2.11.0",
 "vanna==0.7.9",
 "chromadb",
+"sqlalchemy>=2.0.0",
 "xgboost",
 "matplotlib",
 "torch",

@@ -23,6 +24,36 @@ classifiers = ["Programming Language :: Python"]
 authors = [{ name = "Vineeth Kalluru" }]
 maintainers = [{ name = "NVIDIA Corporation" }]
 
+[project.optional-dependencies]
+elasticsearch = [
+"elasticsearch>=8.0.0"
+]
+postgres = [
+"psycopg2-binary>=2.9.0"
+]
+mysql = [
+"pymysql>=1.0.0"
+]
+sqlserver = [
+"pyodbc>=4.0.0"
+]
+oracle = [
+"cx_Oracle>=8.0.0"
+]
+all-databases = [
+"psycopg2-binary>=2.9.0",
+"pymysql>=1.0.0",
+"pyodbc>=4.0.0",
+"cx_Oracle>=8.0.0"
+]
+all = [
+"elasticsearch>=8.0.0",
+"psycopg2-binary>=2.9.0",
+"pymysql>=1.0.0",
+"pyodbc>=4.0.0",
+"cx_Oracle>=8.0.0"
+]
+
 [project.entry-points.'nat.components']
 asset_lifecycle_management_agent = "asset_lifecycle_management_agent.register"
