A fully local, privacy-first personal AI assistant. Ask questions about your own data — emails, documents, chats — without anything leaving your machine.
Stack: Ollama · Qdrant · Streamlit · Python (uv monorepo)
Prerequisites: Docker & Docker Compose
```bash
git clone https://github.com/your-username/myindependent-ai.git
cd myindependent-ai
docker compose up
```

Open http://localhost:8501 — that's it.
First run:
`docker compose up` automatically pulls `llama3.2` (~2 GB) and `nomic-embed-text` (~300 MB) via the `ollama-init` service. Subsequent starts are instant.
| Service | URL | Purpose |
|---|---|---|
| Dashboard | http://localhost:8501 | Streamlit chat UI |
| Ollama | http://localhost:11434 | Local LLM + embeddings |
| Qdrant | http://localhost:6333 | Local vector database |
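Once the stack is up, a small script can confirm that all three services answer over HTTP. This is an illustrative sketch, not part of the repo: the Qdrant `/healthz` and Streamlit `/_stcore/health` routes are the usual health endpoints, but treat the exact paths as assumptions.

```python
import urllib.request

# Hypothetical smoke check for the three local services.
# Assumed endpoints: Ollama lists models at /api/tags, Qdrant answers
# on /healthz, and Streamlit serves a health page at /_stcore/health.
SERVICES = {
    "Ollama": "http://localhost:11434/api/tags",
    "Qdrant": "http://localhost:6333/healthz",
    "Dashboard": "http://localhost:8501/_stcore/health",
}

for name, url in SERVICES.items():
    try:
        with urllib.request.urlopen(url, timeout=5) as resp:
            print(f"{name}: OK ({resp.status})")
    except OSError as exc:  # connection refused, timeout, HTTP error, ...
        print(f"{name}: unreachable ({exc})")
```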
The dashboard searches whatever is in your Qdrant `personal_data` collection. To populate it, run one of the importers:
```bash
# Gmail (requires OAuth credentials)
uv run python -m importers.orchestrator --importer gmail

# WhatsApp exports
uv run python -m importers.orchestrator --importer whatsapp

# Files from Synology NAS
uv run python -m importers.orchestrator --importer synology-nas
```

See docs/CONTRIBUTING.md for adding new importers or setting up your dev environment.
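At its core, each importer embeds text with Ollama and upserts the vectors into the `personal_data` collection. The following minimal sketch shows that flow against the two REST APIs; the payload keys, point IDs, and the `embed`/`upsert` helper names are illustrative, not the repo's actual importer code.

```python
import json
import urllib.request

OLLAMA_URL = "http://localhost:11434"
QDRANT_URL = "http://localhost:6333"
COLLECTION = "personal_data"

def embed(text: str) -> list[float]:
    """Embed text via Ollama's /api/embeddings endpoint (nomic-embed-text)."""
    req = urllib.request.Request(
        f"{OLLAMA_URL}/api/embeddings",
        data=json.dumps({"model": "nomic-embed-text", "prompt": text}).encode(),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)["embedding"]

def build_point(point_id: int, vector: list[float], source: str, text: str) -> dict:
    """Shape one Qdrant point; the payload keys here are just an example."""
    return {"id": point_id, "vector": vector, "payload": {"source": source, "text": text}}

def upsert(points: list[dict]) -> None:
    """Upsert points into the collection via Qdrant's REST API."""
    req = urllib.request.Request(
        f"{QDRANT_URL}/collections/{COLLECTION}/points",
        data=json.dumps({"points": points}).encode(),
        headers={"Content-Type": "application/json"},
        method="PUT",
    )
    urllib.request.urlopen(req).close()

if __name__ == "__main__":
    text = "Dinner with Alex on Friday at 7pm"
    try:
        upsert([build_point(1, embed(text), "manual", text)])
        print("indexed one document")
    except OSError:
        print("Ollama/Qdrant not reachable; start the stack first")
```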
```bash
# Install all dependencies
uv sync --all-packages --group dev

# Run the dashboard
uv run streamlit run apps/admin-dashboard/app.py

# Run tests
uv run pytest
```

You'll need Ollama and Qdrant running separately:

```bash
ollama serve                            # terminal 1
docker run -p 6333:6333 qdrant/qdrant   # terminal 2
```

Copy `.env.example` to `.env` and adjust if needed:

```bash
cp .env.example .env
```

Key variables:
| Variable | Default | Description |
|---|---|---|
| `OLLAMA_BASE_URL` | `http://ollama:11434` | Ollama API URL |
| `QDRANT_URL` | `http://qdrant:6333` | Qdrant URL |
| `MAPPING_DB_PATH` | `/data/mapping.db` | PII mapping database path |
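In application code these variables can be read with plain `os.environ` lookups that fall back to the defaults in the table above. A minimal sketch (the `Settings` name is hypothetical, not a class from this repo):

```python
import os
from dataclasses import dataclass, field

# Illustrative settings holder mirroring the defaults from the table above.
@dataclass
class Settings:
    ollama_base_url: str = field(
        default_factory=lambda: os.environ.get("OLLAMA_BASE_URL", "http://ollama:11434")
    )
    qdrant_url: str = field(
        default_factory=lambda: os.environ.get("QDRANT_URL", "http://qdrant:6333")
    )
    mapping_db_path: str = field(
        default_factory=lambda: os.environ.get("MAPPING_DB_PATH", "/data/mapping.db")
    )

settings = Settings()
```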
```
.
├── apps/
│   ├── admin-dashboard/   # Streamlit chat UI
│   ├── importers/         # Data ingestion (Gmail, WhatsApp, etc.)
│   └── orchestrator/      # Ingestion pipeline runner
├── libs/
│   ├── embedding/         # Ollama embedding wrapper
│   ├── vector-storage/    # Qdrant client wrapper
│   └── privacy-core/      # PII scrubbing (Presidio + SQLite)
├── infrastructure/        # Terraform for GCP deployment
├── scripts/               # Operational helpers
├── docker-compose.yml     # Local stack (Ollama + Qdrant + Dashboard)
└── pyproject.toml         # uv workspace root
```
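To make the `privacy-core` idea concrete, here is a toy stand-in for its scrub-and-restore cycle: detected PII is replaced with stable placeholders and the reverse mapping is kept in SQLite. The real library uses Presidio for detection; this regex-only version, with its `scrub`/`restore` names, is purely illustrative.

```python
import re
import sqlite3

# Toy PII scrubber: emails only, deterministic <EMAIL_n> placeholders,
# reverse mapping stored in a SQLite table (as in libs/privacy-core's
# Presidio + SQLite design, but vastly simplified).
EMAIL_RE = re.compile(r"[\w.+-]+@[\w-]+\.[\w.]+")

def scrub(text: str, db: sqlite3.Connection) -> str:
    db.execute(
        "CREATE TABLE IF NOT EXISTS mapping (placeholder TEXT PRIMARY KEY, original TEXT UNIQUE)"
    )
    def repl(m: re.Match) -> str:
        row = db.execute(
            "SELECT placeholder FROM mapping WHERE original = ?", (m.group(),)
        ).fetchone()
        if row:  # reuse the placeholder already assigned to this email
            return row[0]
        n = db.execute("SELECT COUNT(*) FROM mapping").fetchone()[0]
        placeholder = f"<EMAIL_{n}>"
        db.execute("INSERT INTO mapping VALUES (?, ?)", (placeholder, m.group()))
        return placeholder
    return EMAIL_RE.sub(repl, text)

def restore(text: str, db: sqlite3.Connection) -> str:
    for placeholder, original in db.execute("SELECT placeholder, original FROM mapping"):
        text = text.replace(placeholder, original)
    return text
```

Scrubbing before embedding means raw identifiers never reach the vector store, while the SQLite mapping lets answers be re-personalized on the way back out.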
See docs/architecture.md for the full system design. For GCP/cloud deployment, see infrastructure/.
See the Technical Roadmap for planned work.