Contextual intelligence for everyone. An AI assistant that actually knows your body.
Built for the Anthropic Claude Code Hackathon (Feb 10-16, 2026)
Generic AI assistants give generic advice. "Get 8 hours of sleep." "Save money."
But YOUR body is unique. Your patterns matter.
AEGIS is a voice-first AI pendant that combines health tracking with wealth management, making advice contextually intelligent by understanding YOUR actual patterns.
Instead of generic advice, AEGIS says:
- "You averaged 6 hours on weekdays vs 7.9 on weekends — that 2-hour sleep debt explains your weekday fatigue."
- "You spent $12 on coffee after 5 hours of sleep. Your spending spikes when you're tired."
- "For 5K training, do lunchtime walks — evening exercise will hurt your already-limited weekday sleep."
```
┌─────────────┐
│    ESP32    │  16kHz PCM audio
│  (Pendant)  │ ←────────────────┐
│             │                  │
│ • INMP441   │                  │
│ • PAM8403   │                  │
│ • WiFi      │                  │
└─────────────┘                  │
                                 │
                        WebSocket (binary)
                                 │
┌────────────────────────────────┼─────────────────────────────┐
│  BRIDGE SERVER (FastAPI)       ↓                             │
│                                                              │
│  ┌──────────┐     ┌─────────────┐     ┌──────────────┐       │
│  │   STT    │  →  │   CLAUDE    │  →  │     TTS      │       │
│  │ faster-  │     │  STREAMING  │     │    Kokoro    │       │
│  │ whisper  │     │             │     │    (82M)     │       │
│  └──────────┘     └─────────────┘     └──────────────┘       │
│                         ↓                                    │
│                   ┌─────────────┐                            │
│                   │    TOOLS    │                            │
│                   │ • Health×3  │                            │
│                   │ • Wealth×3  │                            │
│                   │ • Insight×1 │                            │
│                   └─────────────┘                            │
│                         ↓                                    │
│                   ┌─────────────┐                            │
│                   │   SQLite    │                            │
│                   │ • health    │                            │
│                   │ • expenses  │                            │
│                   │ • insights  │                            │
│                   └─────────────┘                            │
└──────────────────────────────────────────────────────────────┘
```
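The pendant streams raw 16kHz, 16-bit mono PCM over a binary WebSocket, and the bridge windows it before handing audio to STT. A minimal sketch of that framing logic (function name, window size, and buffering strategy are illustrative, not the actual AEGIS code):

```python
# Sketch of framing the pendant's raw audio stream into fixed windows.
# SAMPLE_RATE matches the ESP32's 16kHz mono PCM; window size is an assumption.
SAMPLE_RATE = 16_000       # samples per second
BYTES_PER_SAMPLE = 2       # 16-bit samples

def frame_pcm(buffer: bytearray, chunk: bytes, window_s: float = 1.0):
    """Append one binary WebSocket frame; return complete windows for STT."""
    buffer.extend(chunk)
    window_bytes = int(SAMPLE_RATE * BYTES_PER_SAMPLE * window_s)
    windows = []
    while len(buffer) >= window_bytes:
        windows.append(bytes(buffer[:window_bytes]))
        del buffer[:window_bytes]  # keep the remainder for the next frame
    return windows
```

In the real server this would run inside a FastAPI `@app.websocket` handler, with each full window passed to faster-whisper.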
**Dual Model System**
- Haiku 4.5: Fast path (<500ms) for simple queries
- Opus 4.6: Deep analysis with interleaved thinking (THE showcase feature)
**3-Layer System Prompt (with prompt caching)**
- Layer 1: Static persona (cached)
- Layer 2: Dynamic health context (7-day snapshot, regenerated per request)
- Layer 3: Tool directives (cached)
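A hedged sketch of how the three layers could be assembled as system-prompt blocks with Anthropic's `cache_control` markers (constants and the builder function are placeholders; note that because caching is prefix-based, this sketch puts the static layers before the per-request snapshot):

```python
# Placeholder layer contents; the real AEGIS prompts are longer.
PERSONA = "You are AEGIS, a body-aware voice assistant."        # Layer 1
TOOL_DIRECTIVES = "Prefer tool calls over guessing from memory."  # Layer 3

def build_system_prompt(health_snapshot: str) -> list[dict]:
    """Return system blocks: cached static layers first, dynamic layer last."""
    return [
        {"type": "text", "text": PERSONA,
         "cache_control": {"type": "ephemeral"}},      # cached
        {"type": "text", "text": TOOL_DIRECTIVES,
         "cache_control": {"type": "ephemeral"}},      # cached
        {"type": "text", "text": health_snapshot},     # regenerated per request
    ]
```

The returned list would be passed as the `system` parameter of a Messages API call.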
**True Streaming**
- `messages.stream()` for token-by-token delivery
- TTS starts on the first sentence boundary (not after the full response)
- Perceived latency: ~150ms vs ~500ms with a blocking API
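The sentence-boundary trick can be sketched as a small buffer over the token stream (regex and function name are illustrative; the real splitter may use different punctuation rules):

```python
import re

# Split points: whitespace immediately after ., !, or ?
_SENTENCE_END = re.compile(r"(?<=[.!?])\s")

def flush_sentences(buffer: str, token: str):
    """Append one streamed token; return (complete_sentences, remainder).

    Complete sentences can be handed to TTS immediately while the
    model is still streaming the rest of the response.
    """
    buffer += token
    parts = _SENTENCE_END.split(buffer)
    return parts[:-1], parts[-1]
```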
**7 Tools**
- Health: get_health_context, log_health, analyze_health_patterns
- Wealth: track_expense, get_spending_summary, calculate_savings_goal
- Insight: save_user_insight (Claude saves discovered patterns)
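For illustration, one of the seven tools in Anthropic's tool-use format (the description and `input_schema` fields are assumptions, not the actual AEGIS schema):

```python
# Hypothetical definition of the track_expense tool.
TRACK_EXPENSE_TOOL = {
    "name": "track_expense",
    "description": "Log a purchase so spending can be correlated "
                   "with health patterns.",
    "input_schema": {
        "type": "object",
        "properties": {
            "amount": {"type": "number", "description": "Cost in USD"},
            "category": {"type": "string",
                         "description": "e.g. food, coffee, transport"},
            "note": {"type": "string"},
        },
        "required": ["amount", "category"],
    },
}
```

A list of seven such dicts would be passed as the `tools` parameter on each Messages API request.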
Our seed data creates dramatic, visible patterns:
| Metric | Weekday | Weekend | Impact |
|---|---|---|---|
| Sleep | 6.0h avg | 7.9h avg | 1.9h deficit |
| Mood | 2.5/5 | 3.8/5 | Clear correlation with sleep |
| Exercise | 10 min | 45 min | Weekend warrior pattern |
| Food spending | $8.9/day | $34-41/day | 4-5x spike (dining out) |
Sleep-Mood Correlation: 5h sleep → 2.3 mood, 8h sleep → 3.8 mood
This makes Claude responses dramatically contextual.
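A minimal sketch of how such patterned seed data could be generated (function name, date range, and exact bounds are illustrative; the real `python3 -m aegis seed` values may differ):

```python
import random
from datetime import date, timedelta

def seed_sleep(days: int = 14, start: date = date(2026, 2, 1)):
    """Generate (date, hours) rows with a weekday/weekend sleep split."""
    rng = random.Random(42)  # fixed seed for a reproducible demo
    rows = []
    for i in range(days):
        d = start + timedelta(days=i)
        weekend = d.weekday() >= 5
        # ~7.9h average on weekends vs ~6.0h on weekdays
        hours = rng.uniform(7.6, 8.2) if weekend else rng.uniform(5.6, 6.4)
        rows.append((d.isoformat(), round(hours, 1)))
    return rows
```

Mood, exercise, and spending rows would follow the same weekday/weekend logic so the correlations in the table above are visible in a 7-day query.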
```bash
cd /Users/apple/Documents/Aegis
python3 -m venv .venv
source .venv/bin/activate
pip install -r aegis/requirements.txt
cp aegis/.env.example .env
# Add your ANTHROPIC_API_KEY=sk-ant-xxxxx
python3 -m aegis.main
```

The server runs on http://localhost:8000 — navigate there in a browser.
```bash
python3 -m aegis terminal
```

Try:
- "How did I sleep this week?"
- "I spent $45 on dinner"
- "Why am I tired on weekdays?" (triggers Opus!)
- "Help me train for a 5K"
```bash
python3 -m aegis seed
```

Pre-loads health logs and expense tracking for demo queries.
- Interleaved thinking enabled with beta header
- Thinks BETWEEN tool calls (not just before/after)
- Budget: 10,000 tokens for deep reasoning
- Visible in logs and dashboard
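A sketch of the request options that could enable this (the model id and beta-header value are placeholders; check the current Anthropic API docs for the exact strings):

```python
# Hypothetical request options for the Opus deep-analysis path.
OPUS_DEEP_ANALYSIS = {
    "model": "claude-opus-4-6",  # placeholder model id
    "max_tokens": 4096,
    # 10,000-token thinking budget, spent between tool calls
    "thinking": {"type": "enabled", "budget_tokens": 10_000},
    "extra_headers": {
        # placeholder beta flag enabling interleaved thinking
        "anthropic-beta": "interleaved-thinking-2025-05-14",
    },
}
```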
- Haiku TTFT: <200ms (time to first token)
- Perceived latency: ~150ms (streaming + sentence buffering)
- Opus latency: ~2s (acceptable for deep analysis)
- STT: faster-whisper (27M model, CPU-friendly)
- TTS: Kokoro (82M, Apache 2.0, high quality)
- Prompt caching: Static prompt layers cached (saves latency + cost)
- Streaming: Token-by-token delivery (TTS starts immediately)
- Smart routing: Haiku for 80% of queries, Opus for complex 20%
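The routing decision can be sketched as a cheap heuristic run before any API call (trigger words and the length threshold are assumptions, not the actual AEGIS rules):

```python
# Queries about causes, patterns, or training plans escalate to Opus;
# everything else takes the fast Haiku path.
DEEP_TRIGGERS = ("why", "pattern", "correlat", "analyz", "train for")

def pick_model(query: str) -> str:
    q = query.lower()
    if any(t in q for t in DEEP_TRIGGERS) or len(q.split()) > 25:
        return "opus"   # deep analysis with interleaved thinking
    return "haiku"      # fast path, <500ms
```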
| Component | Lines | Status |
|---|---|---|
| Server | ~500 | ✅ Working |
| Tools | ~400 | ✅ 7/7 |
| Database | ~300 | ✅ Working |
| STT/TTS | ~400 | ✅ Working |
| Tests | 136+ | 🟢 Passing |
| Dashboard | ~300 | ✅ Ready |
| Total | ~2,850 | Ready |
Well under the 5,000-line target. All critical paths are tested and working.
- Not just a chatbot — it's body-aware
- Not just health tracking — it connects health + wealth
- Not generic advice — it knows YOUR patterns
- Not slow — streaming makes it feel instant
- Not complex — clean architecture, <3K lines
- Apple Health integration: Real iOS/watchOS data
- Google Fit integration: Android users
- Local LLM fallback: Phi-3-mini for offline queries
- Voice cloning: Personalized TTS voice
- Multi-user: Family health tracking
ESP32 DevKit V1 with INMP441 microphone and PAM8403 audio amplifier
Aegis concept and prototype design
- Claude Opus 4.6 (interleaved thinking) + Haiku 4.5 (speed)
- FastAPI + WebSockets (streaming)
- faster-whisper (STT, 27M model)
- Kokoro TTS (82M, Apache 2.0)
- SQLite (embedded, zero setup)
- ESP32 (microcontroller, $5 hardware)
- Server starts without errors
- WebSocket connection to ESP32 (simulated)
- All 7 tools execute correctly
- STT pipeline working (faster-whisper)
- TTS synthesis working (Kokoro)
- Claude API integration (Opus 4.6 + Haiku 4.5)
- Database operations (health logs, expenses, insights)
- All critical paths tested (136+ tests)
- Tests passing on main branch
- Code review completed (CodeRabbit)
- Production discipline enforced
- Error handling comprehensive
- Logging implemented
- <3,000 lines of code
- Sample data seeded
- Dashboard functional
- Terminal demo working
- WebSocket protocol complete
- Response streaming implemented
- Latency metrics tracked
- README comprehensive
- Architecture documented
- API contracts defined
- Hardware setup documented
- Deployment instructions clear
- Add `ANTHROPIC_API_KEY` to `.env`
- Run `python3 -m aegis seed` (optional)
- Run `python3 -m aegis.main` to start the server
- Open http://localhost:8000 in a browser
- Use `python3 -m aegis terminal` for text queries
Demo Time: ~5 minutes for full feature walkthrough
MIT
Built for the Anthropic Claude Code Hackathon using Claude Opus 4.6's extended thinking capabilities.
Team: Solo developer showcase
Time: 3 days (Feb 13-15, 2026)
Hardware cost: ~$15 (ESP32 + mic + speaker)
The future of AI is contextual. AEGIS is just the beginning.