Commit c25882e

docs: clarify data flow transparency - raw data local, results to LLM

- README: diagram now shows analysis results flow to LLM provider
- README: 'Privacy-first' changed to 'Local processing' with caveat
- SECURITY.md: detailed explanation of what stays local vs what reaches LLM
- Both files recommend local LLM (Ollama) for full air-gapped privacy

1 parent 2a36278 commit c25882e

File tree

2 files changed: +16 −8 lines changed

README.md

Lines changed: 5 additions & 2 deletions

````diff
@@ -63,8 +63,11 @@ This project is built around the **Model Context Protocol (MCP)** — an open st
 
 
 ┌──────────────────────────────────────────────────────────────┐
-│  YOUR DATA (stays on your machine — privacy-first)           │
+│  YOUR DATA (raw files stay on your machine)                  │
 │  Vibration signals · Equipment manuals · Trained models      │
+│                                                              │
+│  ⚠️ Analysis results (peaks, RMS, diagnoses) flow back       │
+│  to the LLM provider. Use a local LLM for full air-gap.      │
 └──────────────────────────────────────────────────────────────┘
 ```
 
@@ -73,7 +76,7 @@ This project is built around the **Model Context Protocol (MCP)** — an open st
 ### What This Means For You
 
 - **🔌 Plug-and-play** — Add new analysis tools (thermography, oil analysis, acoustics) as simple Python functions — the LLM discovers them automatically
-- **🔒 Privacy-first** — All data stays on your machine; only tool calls and results flow through the LLM
+- **🔒 Local processing** — Raw signals never leave your machine; only computed results (peaks, RMS, diagnoses) flow to the LLM. Use a [local LLM](https://ollama.com/) for full air-gapped privacy
 - **🤖 LLM-agnostic** — Works with Claude, ChatGPT, or any MCP-compatible client
 - **🧱 Modular** — Use only the tools you need, extend with your own
 
````
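The "computed results" named in the updated bullet can be sketched as a plain Python function. This is a hypothetical illustration, not the project's actual tool code; `analyze_vibration` and its parameters are invented for the sketch. The point is the shape of the data flow: the raw sample array is processed locally, and only a small dict of metrics is returned as the tool output that reaches the LLM.

```python
# Hypothetical sketch of an MCP-style analysis tool (names are illustrative).
# The raw signal is processed locally; only summary metrics are returned.
import numpy as np

def analyze_vibration(samples: list[float], sample_rate_hz: float) -> dict:
    """Compute summary metrics locally; return only the summary."""
    x = np.asarray(samples, dtype=float)
    spectrum = np.abs(np.fft.rfft(x))
    freqs = np.fft.rfftfreq(x.size, d=1.0 / sample_rate_hz)
    peak_idx = int(np.argmax(spectrum[1:]) + 1)  # skip the DC bin
    return {
        "rms": float(np.sqrt(np.mean(x ** 2))),
        "peak_freq_hz": float(freqs[peak_idx]),
        "n_samples": int(x.size),  # the raw array itself is never returned
    }
```

A pure 50 Hz sine sampled at 1 kHz would come back as roughly `{"rms": 0.707, "peak_freq_hz": 50.0, ...}`: a few dozen bytes of summary, regardless of how long the recording is.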

SECURITY.md

Lines changed: 11 additions & 6 deletions

```diff
@@ -30,10 +30,14 @@ If you discover a security vulnerability in this project, please report it respo
 
 This MCP server is designed to run **locally** on your machine:
 
-- All data processing happens locally (no cloud transmission)
-- Signal data and reports remain on the local filesystem
-- ML models are trained and stored locally
-- No network requests are made during normal operation
+- All signal processing (FFT, envelope, statistics, ML) runs locally — no third-party analytics APIs
+- Raw signal files (CSV), equipment manuals (PDF), and trained ML models never leave your filesystem
+- HTML reports are generated and stored locally
+- The MCP server itself makes **no network requests** during operation
+
+> **Important**: While the server processes data locally, the **analysis results** (peak frequencies, RMS values, statistical summaries, diagnostic text, manual excerpts) are returned to the LLM client and transmitted to the LLM provider's API (e.g., Anthropic, OpenAI). This is inherent to any LLM-based workflow — the LLM needs tool outputs to generate responses. Raw signal arrays are never sent in full; only computed summaries and metrics flow through the LLM.
+>
+> **To maximize privacy**: Use a local LLM (e.g., Ollama, LM Studio) as your MCP client — this keeps the entire pipeline on your machine with zero data leaving your network.
 
 ### File System Access
 
@@ -58,6 +62,7 @@ The server accesses the local filesystem for:
 ### Data Privacy
 
 - **No telemetry**: The server does not collect or transmit usage data
-- **No external APIs**: All analysis runs locally without internet
+- **No external APIs**: The server itself makes no network calls — all analysis runs locally
 - **Sample data**: Included dataset is from public research sources (MathWorks)
-- **Your data**: Any proprietary vibration data you add stays on your machine
+- **Raw data stays local**: Your proprietary vibration signals, manuals, and models remain on your filesystem
+- **Analysis results flow to LLM**: Computed metrics (frequencies, RMS, kurtosis, diagnoses) are returned to the LLM provider as tool outputs — this is inherent to MCP-based workflows. For full air-gapped privacy, use a local LLM client
```
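The asymmetry the SECURITY.md note describes ("raw signal arrays are never sent in full; only computed summaries flow through the LLM") can be made concrete with a small sketch. This is illustrative only, with invented data, not project code: it contrasts the size of a raw in-memory signal with the JSON tool output that would actually be serialized for the LLM provider.

```python
# Illustrative sketch: what stays local vs what a tool call serializes.
import json
import numpy as np

rng = np.random.default_rng(0)
raw_signal = rng.normal(size=60 * 10_000)  # 60 s at 10 kHz — stays local

# Summary metrics of the kind a diagnostic tool might return.
summary = {
    "rms": round(float(np.sqrt(np.mean(raw_signal ** 2))), 4),
    "kurtosis": round(
        float(np.mean((raw_signal - raw_signal.mean()) ** 4)
              / np.var(raw_signal) ** 2), 4),
    "n_samples": raw_signal.size,
}
payload = json.dumps(summary)  # this string is all the LLM provider sees
print(len(payload), "bytes of summary vs", raw_signal.nbytes, "bytes of raw data")
```

The raw array here occupies 4.8 MB in memory, while the serialized summary is well under 200 bytes; with a local LLM client even that summary never leaves the machine.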
