docs: clarify data flow transparency - raw data local, results to LLM

- README: diagram now shows analysis results flow to LLM provider
- README: 'Privacy-first' changed to 'Local processing' with caveat
- SECURITY.md: detailed explanation of what stays local vs what reaches LLM
- Both files recommend local LLM (Ollama) for full air-gapped privacy
README.md

@@ -73,7 +76,7 @@ This project is built around the **Model Context Protocol (MCP)** — an open st
 ### What This Means For You
 
 - **🔌 Plug-and-play** — Add new analysis tools (thermography, oil analysis, acoustics) as simple Python functions — the LLM discovers them automatically
-- **🔒 Privacy-first** — All data stays on your machine; only tool calls and results flow through the LLM
+- **🔒 Local processing** — Raw signals never leave your machine; only computed results (peaks, RMS, diagnoses) flow to the LLM. Use a [local LLM](https://ollama.com/) for full air-gapped privacy
 - **🤖 LLM-agnostic** — Works with Claude, ChatGPT, or any MCP-compatible client
 - **🧱 Modular** — Use only the tools you need, extend with your own
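To make the "plug-and-play" bullet concrete: a new analysis tool is just a Python function that takes raw samples and returns compact metrics. The sketch below is illustrative only and stdlib-only; the function name and result fields are hypothetical, and in the real server the function would additionally be registered with the MCP framework so the LLM can discover it.

```python
import math

# Hypothetical sketch of a plug-and-play analysis tool. In the actual
# server this function would be exposed through the MCP tool-registration
# mechanism; here it just shows the raw-in, summary-out shape.
def vibration_summary(signal: list[float], sample_rate_hz: float) -> dict:
    """Return compact metrics; the raw signal array is never passed onward."""
    n = len(signal)
    rms = math.sqrt(sum(x * x for x in signal) / n)
    peak = max(abs(x) for x in signal)
    return {
        "n_samples": n,
        "duration_s": n / sample_rate_hz,
        "rms": round(rms, 4),
        "peak": round(peak, 4),
        "crest_factor": round(peak / rms, 4),
    }
```

Only the returned dict becomes a tool result; the `signal` list itself stays in local memory.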
SECURITY.md (11 additions, 6 deletions)
@@ -30,10 +30,14 @@ If you discover a security vulnerability in this project, please report it respo
 
 This MCP server is designed to run **locally** on your machine:
 
-- All data processing happens locally (no cloud transmission)
-- Signal data and reports remain on the local filesystem
-- ML models are trained and stored locally
-- No network requests are made during normal operation
+- All signal processing (FFT, envelope, statistics, ML) runs locally — no third-party analytics APIs
+- Raw signal files (CSV), equipment manuals (PDF), and trained ML models never leave your filesystem
+- HTML reports are generated and stored locally
+- The MCP server itself makes **no network requests** during operation
+
+> **Important**: While the server processes data locally, the **analysis results** (peak frequencies, RMS values, statistical summaries, diagnostic text, manual excerpts) are returned to the LLM client and transmitted to the LLM provider's API (e.g., Anthropic, OpenAI). This is inherent to any LLM-based workflow — the LLM needs tool outputs to generate responses. Raw signal arrays are never sent in full; only computed summaries and metrics flow through the LLM.
+>
+> **To maximize privacy**: Use a local LLM (e.g., Ollama, LM Studio) as your MCP client — this keeps the entire pipeline on your machine with zero data leaving your network.
 
 ### File System Access
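The "raw arrays stay local, only computed summaries reach the LLM" claim can be sketched in a few lines of stdlib Python. This is illustrative only, not project code, and all names are hypothetical: a large in-memory signal is reduced to a small JSON string, and only that string would travel as an MCP tool result.

```python
import json
import math
import random

# Illustrative sketch (not project code): the raw signal stays in local
# memory or on disk, while only a small JSON summary becomes the tool
# output that the MCP client forwards to the LLM provider.
random.seed(42)
raw_signal = [random.gauss(0.0, 1.0) for _ in range(100_000)]  # never leaves the machine

summary = {
    "rms": round(math.sqrt(sum(x * x for x in raw_signal) / len(raw_signal)), 4),
    "peak": round(max(abs(x) for x in raw_signal), 4),
    "n_samples": len(raw_signal),
}
tool_output = json.dumps(summary)  # this short string is all that reaches the LLM

print(f"raw samples kept local: {len(raw_signal)}")
print(f"bytes flowing to the LLM: {len(tool_output)}")
```

The asymmetry is the point: hundreds of thousands of samples in, a tool result of well under a kilobyte out.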
@@ -58,6 +62,7 @@ The server accesses the local filesystem for:
 ### Data Privacy
 
 - **No telemetry**: The server does not collect or transmit usage data
-- **No external APIs**: All analysis runs locally without internet
+- **No external APIs**: The server itself makes no network calls — all analysis runs locally
 - **Sample data**: Included dataset is from public research sources (MathWorks)
-- **Your data**: Any proprietary vibration data you add stays on your machine
+- **Raw data stays local**: Your proprietary vibration signals, manuals, and models remain on your filesystem
+- **Analysis results flow to LLM**: Computed metrics (frequencies, RMS, kurtosis, diagnoses) are returned to the LLM provider as tool outputs — this is inherent to MCP-based workflows. For full air-gapped privacy, use a local LLM client