DevTrace AI is your team's permanent debugging memory - log bugs, get instant AI analysis, save what works, and debug with teammates in real time at both session and project level. Works offline. Remembers everything.
| | |
|---|---|
| 🔍 | Every bug gets a permanent structured record |
| 🤖 | Full AI breakdown - root cause, fixes, timeline, 8-tab panel |
| 🧬 | Debug DNA - your personal error fingerprint |
| 🧠 | Hybrid Local-First RAG - semantic vector search + structured SQL, entirely on-device |
| 🔁 | Similar Sessions - instantly finds bugs you've seen before |
| 👥 | Session Collaboration - shared checklist, presence, team chat |
| 📋 | Project Collaboration - activity feed, project chat, project presence |
| 🤖 | Mastra AI Agents - Session Debugger + Project Analyzer via Mastra Cloud |
| 📶 | Fully offline via PowerSync local SQLite (11 tables, 5 sync buckets) |
| 🔗 | Share projects and sessions with teammates |
The core problem it solves: Debugging is slow and scattered. You repeat the same mistakes, forget what fixed what, and lose context every time you close a tab. DevTrace AI is your team's permanent debugging memory - and it works even when the internet doesn't.
1. You paste an error -> Log a debug session (error, stack trace, code, severity)
2. Embeddings generated -> transformers.js generates on-device semantic vectors instantly
3. Hybrid search triggers -> Cosine similarity + keyword scoring against local SQLite history
4. Click "Analyze Bug" -> Groq + Llama 3.3 70B returns full structured analysis server-side
5. Read the 8-tab breakdown -> Overview, Fixes, Timeline, Checklist, Chat, Tests, Logs, Structure
6. Run Mastra Deep Analysis -> Session Debugger agent reasons through stack trace, returns diff fix
7. Invite a teammate -> They join the session - presence, checklist, and chat sync live
8. Watch the activity feed -> Every session event logged to project feed, visible to all collaborators
9. Save what worked -> Fix goes to Fix Library, tagged and searchable forever
10. Generate Debug DNA -> Supabase Edge Function analyzes your patterns + Groq writes fingerprint
Hybrid RAG Match - Local-First Semantic Search

Mastra Deep Analysis - Session Debugger Agent

Mastra Project Analysis - Project Analyzer Agent

Project Collaboration - Live Presence & Project Chat

Project Collaboration - Activity Feed & Chat Sync

All reads come from a local SQLite database (PowerSync). Zero network latency - instant.
All writes go through PowerSync's mutation queue - written to local SQLite first, then uploaded to Supabase automatically. Large blobs like ai_analysis bypass the mutation queue and go direct to Supabase, then sync back down via WAL.
```text
WRITE (small fields) -> powerSync.execute() -> Local SQLite -> PowerSync uploads -> Supabase Postgres
WRITE (ai_analysis)  -> supabase.update()   -> Supabase Postgres -> PowerSync WAL listener -> Local SQLite
READ                 <- useQuery() from @powersync/react <- Local SQLite (0ms, no spinner)
```
Offline? powerSync.execute() writes to local SQLite and queues the upload automatically. The moment you reconnect, PowerSync flushes the queue to Supabase with no extra code needed.
Every session gets a full structured breakdown powered by Groq + Llama 3.3 70B - called server-side via a Supabase Edge Function. The Groq API key is never exposed to the browser. The complete analysis is saved as JSONB in Supabase - persists across reloads, no re-analyzing needed.
- 🔍 Overview - Plain English explanation, root cause, symptom vs cause, category badge, confidence score, files to check
- ⚡ Fixes - 3 options (quick patch, proper fix, workaround) each with full code & pros/cons
- 🕐 Timeline - Visual step-by-step of how the crash happened from component mount to error throw
- ✅ Checklist - Shared interactive checklist - syncs live across all collaborators via PowerSync
- 💬 Followup - Context-aware AI chat - click suggested questions or type your own
- 🧪 Tests - AI-generated reproduction steps and test cases to verify the fix works
- 📋 Logs - Paste raw console or server logs - AI strips noise and surfaces what matters
- 🏗️ Structure - Paste your file tree - AI reviews architecture and flags problems
The entire analysis is rate-limited at 20 AI requests per user per hour via a rate_limits table in Supabase - enforced server-side in the Edge Function before any Groq call is made.
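As a sketch, the sliding-window check the Edge Function performs looks roughly like this. The in-memory timestamp array stands in for the `rate_limits` table - the real enforcement is server-side:

```typescript
// Illustration of a 20-requests-per-hour sliding-window check.
// The actual check runs server-side against a rate_limits table.
const LIMIT = 20;
const WINDOW_MS = 60 * 60 * 1000; // one hour

function allowRequest(
  timestamps: number[], // prior request times for this user (ms epoch)
  now: number
): { allowed: boolean; timestamps: number[] } {
  // Keep only requests inside the sliding one-hour window
  const recent = timestamps.filter(t => now - t < WINDOW_MS);
  if (recent.length >= LIMIT) return { allowed: false, timestamps: recent };
  return { allowed: true, timestamps: recent.concat([now]) };
}
```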
DevTrace AI adds a second AI layer on top of the standard analysis - two specialized Mastra Cloud agents that go deeper than the 8-tab panel.
```text
Client clicks "Run Deep Analysis"
        |
POST /functions/v1/mastra-agent (JWT verified)
        |
Supabase Edge Function proxies to Mastra Cloud
        |
Session Debugger or Project Analyzer agent reasons through the data
        |
Structured JSON response returned - diff-format fix, risk analysis, recommendations
        |
Rich UI renders sections: root cause, before/after diff, verification steps, risks
```
- Identifies the exact broken line with a before/after diff-format fix
- Explains why this error pattern occurs specifically in your language/framework
- Provides alternative approaches with tradeoff analysis
- Lists verification steps to confirm the fix worked
- Flags related risks that might surface after applying the fix
- Detects recurring error patterns across all sessions with frequency counts
- Identifies systemic architectural issues causing multiple errors
- Provides a health verdict: Excellent / Good / Needs Attention / Critical
- Generates prioritized recommendations: Immediate / Short-term / Long-term
- Analyzes resolution trends - what gets fixed quickly, what lingers, and why
Both agents are called via a JWT-verified Supabase Edge Function. The Mastra API key never reaches the browser.
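As an illustration of the frequency counting behind the Project Analyzer's recurring-pattern detection, error messages can be collapsed into coarse signatures and tallied. The normalization rules below are assumptions for the sketch, not the agent's actual logic:

```typescript
// Illustrative signature normalization: collapse numbers and string
// literals so "line 12" and "line 98" variants of the same bug match.
function errorSignature(message: string): string {
  return message
    .toLowerCase()
    .replace(/\d+/g, "N")            // collapse numbers (line numbers, ids)
    .replace(/['"][^'"]*['"]/g, "S") // collapse quoted literals
    .trim();
}

// Tally how often each signature occurs across sessions
function recurringPatterns(messages: string[]): Record<string, number> {
  const counts: Record<string, number> = {};
  for (const m of messages) {
    const sig = errorSignature(m);
    counts[sig] = (counts[sig] ?? 0) + 1;
  }
  return counts;
}
```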
DevTrace AI implements a Retrieval-Augmented Generation layer that runs entirely on-device - no server, no network round-trip, no latency.
```text
Error logged
        |
transformers.js (Xenova/all-MiniLM-L6-v2) generates 384-dimension embedding in browser
        |
Embedding stored in error_embedding column via powerSync.execute()
        |
PowerSync syncs embedding to all devices via WAL
        |
Open any session - hybrid search fires instantly
        |
Layer 1: Keyword scoring - token overlap against error_message in local SQLite
Layer 2: Semantic scoring - cosine similarity against stored embeddings
        |
Top matches surfaced with confidence score - zero network, works offline
```
- On-Device Embeddings - transformers.js (Xenova/all-MiniLM-L6-v2) generates vectors entirely in the browser. No API call, no server, no cost per query
- SQLite Vector Store - PowerSync keeps embeddings synced and available in local SQLite across all devices
- Dual Scoring - keyword overlap catches exact matches; cosine similarity catches semantically related bugs with different wording
- Works Offline - the entire retrieval layer runs on local SQLite - pattern matching is available even without internet
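The dual scoring above can be sketched as follows. The 0.5/0.5 blend and the tokenization rules are illustrative assumptions - the app's exact weights may differ:

```typescript
// Illustrative hybrid scoring: keyword overlap + cosine similarity.
function tokenize(text: string): string[] {
  const seen: Record<string, boolean> = Object.create(null);
  return text.toLowerCase().split(/\W+/).filter(t => {
    if (t.length <= 2 || seen[t]) return false; // drop short/duplicate tokens
    seen[t] = true;
    return true;
  });
}

// Layer 1: keyword overlap (Jaccard over token sets)
function keywordScore(a: string, b: string): number {
  const ta = tokenize(a), tb = tokenize(b);
  if (ta.length === 0 || tb.length === 0) return 0;
  const shared = ta.filter(t => tb.indexOf(t) !== -1).length;
  return shared / (ta.length + tb.length - shared);
}

// Layer 2: cosine similarity between stored embeddings
function cosine(a: number[], b: number[]): number {
  let dot = 0, na = 0, nb = 0;
  for (let i = 0; i < a.length; i++) {
    dot += a[i] * b[i];
    na += a[i] * a[i];
    nb += b[i] * b[i];
  }
  return dot / (Math.sqrt(na) * Math.sqrt(nb) || 1);
}

// Blend the two layers into one confidence score (weights assumed)
function hybridScore(
  queryText: string, queryVec: number[],
  candText: string, candVec: number[]
): number {
  return 0.5 * keywordScore(queryText, candText) + 0.5 * cosine(queryVec, candVec);
}
```

The keyword layer catches exact token matches; the cosine layer catches semantically related errors with different wording - which is why both are run.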
When you're offline and open a session without prior AI analysis, DevTrace AI doesn't just show a spinner. It synthesizes guidance from your local debugging history using a multi-layer aggregation engine.
```text
Offline - open a new session with an error message
        |
useOfflineMemory extracts meaningful tokens (strips noise words)
        |
powerSync.getAll() queries local SQLite for sessions with ai_analysis
        |
Sessions scored by token overlap - top 5 matches retrieved
        |
Knowledge extracted: root causes, fixes, checklist items, test cases, files
        |
Voted synthesis: most common root cause surfaced as primary likely cause
        |
OfflineAssistCard renders - clearly labeled as synthesized from local history
```
- Confidence levels - High / Medium / Low based on match quality and count
- Evidence linked - every suggestion shows which past sessions it came from
- Expandable fixes - best past fixes with full code, expandable inline
- Never misleads - result is clearly labeled as synthesized offline guidance, not fresh AI analysis
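The synthesis pipeline above can be sketched end to end. The noise-word list and scoring here are illustrative assumptions, not the app's exact implementation:

```typescript
// Illustrative offline synthesis: strip noise words, score past sessions
// by token overlap, keep the top 5, and vote on the root cause.
const NOISE = ["the", "a", "an", "at", "in", "of", "is", "to", "and", "error"];

function meaningfulTokens(text: string): string[] {
  return text.toLowerCase().split(/\W+/).filter(t => t.length > 1 && NOISE.indexOf(t) === -1);
}

interface PastSession { errorMessage: string; rootCause: string }

function synthesizeRootCause(query: string, history: PastSession[]): string | null {
  const q = meaningfulTokens(query);
  // Score each analyzed past session by token overlap with the new error
  const scored = history
    .map(s => ({
      s,
      score: meaningfulTokens(s.errorMessage).filter(t => q.indexOf(t) !== -1).length,
    }))
    .filter(x => x.score > 0)
    .sort((a, b) => b.score - a.score)
    .slice(0, 5); // top 5 matches
  // Voted synthesis: the most common root cause among matches wins
  let best: string | null = null;
  const votes: Record<string, number> = Object.create(null);
  for (const x of scored) {
    const c = x.s.rootCause;
    votes[c] = (votes[c] ?? 0) + 1;
    if (best === null || votes[c] > votes[best]) best = c;
  }
  return best;
}
```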
DevTrace AI turns a debug session into a shared live workspace. All powered by PowerSync WAL sync - no WebSocket server, no Supabase Realtime subscription, no polling.
```text
Owner opens session
        |
Teammate opens the shared session
        |
Presence row written to session_presence via powerSync.execute()
        |
PowerSync WAL syncs instantly to owner's local SQLite
        |
Owner sees "Teammate is debugging with you" banner - live dot pulsing
        |
Both can check off checklist items - syncs to all participants instantly
        |
Both can send chat messages - delivered via PowerSync, zero polling
```
Three tables power session collaboration - all synced via PowerSync WAL:
- `session_presence` - one row per user per session, `last_seen_at` updated every 30s
- `session_checklist` - one row per checklist item, checked/unchecked state + who did it
- `session_chat` - flat message log tied to the session
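With a 30-second heartbeat, liveness can be derived from `last_seen_at` - rows older than some cutoff are dropped from the avatar stack. A sketch (the 90s threshold, roughly three missed beats, is an assumption):

```typescript
// Presence rows older than the cutoff are treated as stale.
// 90s = ~3 missed 30s heartbeats (threshold assumed for illustration).
const STALE_MS = 90_000;

interface PresenceRow { userId: string; lastSeenAt: number }

// Returns the users whose heartbeat is still fresh
function activeCollaborators(rows: PresenceRow[], now: number): string[] {
  return rows.filter(r => now - r.lastSeenAt < STALE_MS).map(r => r.userId);
}
```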
```typescript
// Presence heartbeat - fires on mount, every 30s, cleans up on unmount
await powerSync.execute(
  `INSERT INTO session_presence (id, session_id, user_id, display_name, last_seen_at, joined_at)
   VALUES (?, ?, ?, ?, ?, ?)`,
  [id, sessionId, userId, displayName, now, now]
);

// Checklist toggle - syncs to all collaborators instantly
await powerSync.execute(
  `UPDATE session_checklist SET checked = ?, checked_by_name = ? WHERE session_id = ? AND item_index = ?`,
  [1, displayName, sessionId, itemIndex]
);

// Chat message - delivered via PowerSync WAL
await powerSync.execute(
  `INSERT INTO session_chat (id, session_id, user_id, display_name, message, created_at)
   VALUES (?, ?, ?, ?, ?, ?)`,
  [id, sessionId, userId, displayName, message, now]
);
```

Collaboration extends beyond individual sessions to the entire project. Every session mutation - create, resolve, analyze, update, delete - is automatically logged to a project activity feed and synced to all collaborators instantly.
```text
Owner opens project
        |
Teammate opens the shared project
        |
project_presence row written via powerSync.execute()
        |
Owner sees avatar stack + live dot in project header
        |
Joel creates a new session -> project_activity row logged automatically
        |
Sarah sees "Joel created Bug: Auth token expired" in activity feed instantly
        |
Sarah resolves a session -> "Sarah resolved Bug: Auth token expired" logged
        |
Both can chat at project level - Project Chat syncs via PowerSync
```
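Feed entries like "Joel created Bug: Auth token expired" can be composed from an activity row roughly like this - the event types mirror the `project_activity` log, but the helper itself is illustrative:

```typescript
// Map project_activity event types to human-readable verbs
// (helper is illustrative; event names follow the project_activity log).
const VERBS: Record<string, string> = {
  session_created: "created",
  session_resolved: "resolved",
  session_analyzed: "analyzed",
  session_updated: "updated",
  session_deleted: "deleted",
};

function formatActivity(actor: string, event: string, sessionTitle: string): string {
  return `${actor} ${VERBS[event] ?? event} ${sessionTitle}`;
}
```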
Activity logging is wired directly into useSessions.ts - no extra call-site code needed:
```typescript
// Auto-logged on session create
await logProjectActivity(user, projectId, 'session_created', id, title);

// Auto-logged when status changes to resolved
await logProjectActivity(user, projectId, 'session_resolved', id, title);

// Auto-logged when AI analysis runs
await logProjectActivity(user, projectId, 'session_analyzed', id, title);
```

Supabase is the source of truth and auth backbone for the entire app.
- Email + Password - `supabase.auth.signInWithPassword()`
- GitHub OAuth - `signInWithOAuth({ provider: 'github' })`
- Google OAuth - `signInWithOAuth({ provider: 'google' })`
- Password Reset - `resetPasswordForEmail()` -> branded magic link -> `/reset-password`
- GitHub Linking - `linkIdentity({ provider: 'github' })` -> username saved to `profiles`
- Session sync - `onAuthStateChange()` keeps the Zustand `authStore` live across all tabs
Every table has Row Level Security enabled.
| Table | Purpose |
|---|---|
| `profiles` | User name, avatar, GitHub connection, dark mode preference |
| `projects` | Project groupings with GitHub URL and health metrics |
| `debug_sessions` | Full session data including `ai_analysis` JSONB and `error_embedding` vector |
| `fixes` | Fix library entries with tags and use count |
| `shares` | Access grants between users for projects and sessions |
| `session_presence` | Live presence per user per session, heartbeat every 30s |
| `session_checklist` | Shared checklist state - one row per item, who checked what |
| `session_chat` | Team chat messages tied to a session |
| `project_presence` | Live presence per user per project, heartbeat every 30s |
| `project_activity` | Event log - session created/resolved/analyzed/updated/deleted |
| `project_chat` | Team chat messages tied to a project |
`analyze-bug` - handles all Groq AI calls server-side:

- JWT verified on every request - unauthorized calls rejected before touching Groq
- Routes four actions: `analyzeSession`, `sendFollowUp`, `analyzeLogs`, `analyzeStructure`
- Rate limited: 20 requests per user per hour via the `rate_limits` table
- Groq API key stored in Supabase Secrets - never in the browser

`debug-dna` - generates your personal debugging fingerprint:

- Service role key queries Postgres directly server-side
- SQL aggregations across your full session history + Groq narrative

`mastra-agent` - proxies calls to Mastra Cloud agents:

- JWT verified before touching the Mastra API key
- Routes `debugSession` to the Session Debugger agent
- Routes `analyzeProject` to the Project Analyzer agent
- Forces structured JSON output - rich UI renders sections, diffs, badges
```sql
-- All 11 tables replicated via WAL
alter publication powersync add table
  profiles, projects, debug_sessions, fixes, shares,
  session_presence, session_checklist, session_chat,
  project_presence, project_activity, project_chat;
```

PowerSync is the offline engine and real-time collaboration layer - powering both session-level and project-level collaboration with zero custom backend code.
```typescript
// All zero-network reads - local SQLite
const { data: sessions } = useQuery('SELECT * FROM debug_sessions WHERE user_id = ?', [uid]);
const { data: collaborators } = useQuery('SELECT * FROM session_presence WHERE session_id = ?', [id]);
const { data: checklist } = useQuery('SELECT * FROM session_checklist WHERE session_id = ?', [id]);
const { data: sessionMessages } = useQuery('SELECT * FROM session_chat WHERE session_id = ?', [id]);
const { data: projectPresence } = useQuery('SELECT * FROM project_presence WHERE project_id = ?', [pid]);
const { data: activityFeed } = useQuery('SELECT * FROM project_activity WHERE project_id = ? ORDER BY created_at DESC LIMIT 50', [pid]);
const { data: projectMessages } = useQuery('SELECT * FROM project_chat WHERE project_id = ?', [pid]);

// Session collaboration
await powerSync.execute(`INSERT INTO session_presence ...`, [...]);
await powerSync.execute(`UPDATE session_checklist SET checked = ? ...`, [...]);
await powerSync.execute(`INSERT INTO session_chat ...`, [...]);

// Project collaboration
await powerSync.execute(`INSERT INTO project_presence ...`, [...]);
await powerSync.execute(`INSERT INTO project_activity ...`, [...]); // auto-logged by useSessions
await powerSync.execute(`INSERT INTO project_chat ...`, [...]);

// Embeddings stored alongside session data
await powerSync.execute(`UPDATE debug_sessions SET error_embedding = ? ...`, [JSON.stringify(embedding), id]);
```

Large blob exception: `ai_analysis` goes direct to Supabase to avoid overloading the WASM crud reader, then syncs back via WAL.
| State | What happens |
|---|---|
| 🟢 App opens online | PowerSync connects and streams latest changes from Supabase |
| 🟢 User reads data | useQuery() returns from local SQLite - instant, 0ms |
| 🟢 User opens a session | Hybrid RAG fires, presence heartbeat fires, all collab state loads |
| 🟢 Teammate joins session | Presence row syncs via WAL - owner sees banner within 1-2 seconds |
| 🟢 Session resolved | project_activity row logged automatically - all collaborators see it |
| 🟠 Internet drops | Orange banner appears - all reads still work, writes queue locally |
| 🟠 User opens offline session | Offline Memory Assist synthesizes guidance from local SQLite history |
| 🟠 User creates offline | powerSync.execute() writes to SQLite, upload queued automatically |
| 🟢 Internet returns | PowerSync flushes queue to Supabase, WAL syncs delta back down |
```json
{
  "bucket_definitions": {
    "user_data": {
      "parameters": "SELECT request.user_id() as user_id",
      "data": [
        "SELECT * FROM profiles WHERE id = bucket.user_id",
        "SELECT * FROM projects WHERE user_id = bucket.user_id",
        "SELECT * FROM debug_sessions WHERE user_id = bucket.user_id",
        "SELECT * FROM fixes WHERE user_id = bucket.user_id",
        "SELECT * FROM shares WHERE owner_id = bucket.user_id"
      ]
    },
    "shared_sessions": {
      "parameters": "SELECT resource_id as session_id FROM shares WHERE invitee_id = request.user_id() AND resource_type = 'session'",
      "data": [
        "SELECT * FROM debug_sessions WHERE id = bucket.session_id",
        "SELECT * FROM session_presence WHERE session_id = bucket.session_id",
        "SELECT * FROM session_checklist WHERE session_id = bucket.session_id",
        "SELECT * FROM session_chat WHERE session_id = bucket.session_id"
      ]
    },
    "owned_session_collab": {
      "parameters": "SELECT id as session_id FROM debug_sessions WHERE user_id = request.user_id()",
      "data": [
        "SELECT * FROM session_presence WHERE session_id = bucket.session_id",
        "SELECT * FROM session_checklist WHERE session_id = bucket.session_id",
        "SELECT * FROM session_chat WHERE session_id = bucket.session_id"
      ]
    },
    "owned_project_collab": {
      "parameters": "SELECT id as project_id FROM projects WHERE user_id = request.user_id()",
      "data": [
        "SELECT * FROM project_presence WHERE project_id = bucket.project_id",
        "SELECT * FROM project_activity WHERE project_id = bucket.project_id",
        "SELECT * FROM project_chat WHERE project_id = bucket.project_id"
      ]
    },
    "shared_projects": {
      "parameters": "SELECT resource_id as project_id FROM shares WHERE invitee_id = request.user_id() AND resource_type = 'project'",
      "data": [
        "SELECT * FROM projects WHERE id = bucket.project_id",
        "SELECT * FROM debug_sessions WHERE project_id = bucket.project_id",
        "SELECT * FROM project_presence WHERE project_id = bucket.project_id",
        "SELECT * FROM project_activity WHERE project_id = bucket.project_id",
        "SELECT * FROM project_chat WHERE project_id = bucket.project_id"
      ]
    }
  }
}
```

- Session Tracking - Log errors with stack trace, code snippet, expected behavior, environment, and severity
- AI Debug Panel - 8-tab full breakdown via Groq + Llama 3.3 70B server-side, saved permanently as JSONB
- Mastra Session Debugger - Deep analysis agent: exact broken line, diff-format fix, risk flags, verification steps
- Mastra Project Analyzer - Pattern detection agent: recurring bugs, systemic issues, prioritized recommendations
- Hybrid RAG - On-device transformers.js embeddings + keyword scoring against local SQLite - zero network
- Similar Sessions - Finds past bugs with matching error patterns - works offline
- Follow-up Chat - Context-aware AI chat inside every session
- Fix Library - Save working fixes, filter by language, copy in one click, track use count
- Export as Markdown - Export any session as a `.md` file
- Presence Indicators - Live avatar stack showing who is currently in the session
- Shared Checklist - AI checklist syncs live across all collaborators - shows who checked each item
- Session Chat - Real-time flat message thread tied to the session
- Auto Chat Open - Chat panel opens automatically when a collaborator joins
- Zero Backend Code - Entirely PowerSync WAL-driven, no WebSocket, no polling
- Project Presence - Avatar stack in project header showing who is browsing right now
- Activity Feed - Every session mutation logged as a live event - clickable to navigate to that session
- Project Chat - Team discussion at project level, separate from session chat
- Auto Activity Logging - `useSessions` logs events to `project_activity` on every mutation
- Personal Error Fingerprint - Supabase Edge Function queries session history server-side
- AI Narrative - Groq generates personalized profile of your debugging strengths and weaknesses
- Category Resolution Rates - See which error types you crush and which ones beat you
- Weekly Activity Chart - Sessions logged per week over the last 4 weeks
- Export DNA Report - Download your full Debug DNA as Markdown
- Offline-First Reads - All reads from local SQLite via PowerSync - zero network dependency
- Offline Writes - `powerSync.execute()` queues mutations locally, auto-uploads on reconnect
- Offline Memory Assist - Synthesizes AI guidance from local SQLite history when offline
- On-Device Embeddings - transformers.js generates semantic vectors in the browser - no API call
- Real-Time Sync - PowerSync streams WAL changes to local SQLite instantly
- Sync Status Page - Live row counts across all 11 tables, sync health, upload queue
- Server-side AI - Groq API key and Mastra API key never reach the browser
- JWT Verification - Every Edge Function verifies user JWT before any external API call
- Rate Limiting - 20 AI requests per user per hour, enforced in the `analyze-bug` Edge Function
- RLS on all tables - Every Supabase table has Row Level Security enabled
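The Category Resolution Rates feature above boils down to a per-category aggregation over session history. A sketch with illustrative field names (the real query runs as SQL in the `debug-dna` Edge Function):

```typescript
// Illustrative aggregation: share of resolved sessions per error category.
// Field names are assumptions; the real version is a server-side SQL query.
interface SessionRow { category: string; resolved: boolean }

function resolutionRates(sessions: SessionRow[]): Record<string, number> {
  const total: Record<string, number> = {};
  const done: Record<string, number> = {};
  for (const s of sessions) {
    total[s.category] = (total[s.category] ?? 0) + 1;
    if (s.resolved) done[s.category] = (done[s.category] ?? 0) + 1;
  }
  const rates: Record<string, number> = {};
  for (const cat of Object.keys(total)) rates[cat] = (done[cat] ?? 0) / total[cat];
  return rates;
}
```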
| | Technology | Role |
|---|---|---|
| ⚛️ | React 18 + TypeScript + Vite | Frontend framework + type safety + build tool |
| 🎨 | Tailwind CSS | Utility-first styling + dark mode |
| 🐻 | Zustand | Lightweight global state (auth, sync queue) |
| 🟢 | Supabase | Postgres · Auth · Storage · RLS · WAL replication · Edge Functions |
| ⚡ | PowerSync | Local SQLite sync · offline mutations · session + project collaboration · embeddings store |
| 🤖 | Groq + Llama 3.3 70B | Server-side AI inference - debug analysis + Debug DNA |
| 🧠 | Mastra Cloud | Session Debugger + Project Analyzer AI agents |
| 🔍 | Xenova/transformers.js | On-device semantic embeddings (all-MiniLM-L6-v2) |
| 📊 | Recharts | Analytics charts and data visualization |
| 🚀 | Vercel | Zero-config deployment + preview URLs |
```shell
git clone https://github.com/JexanJoel/DevTrace-AI.git
cd DevTrace-AI
npm install
```

For full setup instructions - Supabase, PowerSync, Mastra, environment variables - see CONTRIBUTING.md.
DevTrace AI is submitted to the PowerSync AI Hackathon 2026.
| Prize | Why this qualifies |
|---|---|
| 🥇 Core Prize | AI-powered team debugging platform using PowerSync as the state sync layer for humans and AI agents - session + project collaboration, Hybrid RAG, offline AI memory, Mastra agents |
| 🏅 Best Submission Using Supabase | Auth (3 providers), 11 RLS tables, Storage, WAL replication, 3 Edge Functions (analyze-bug with rate limiting, debug-dna, mastra-agent) |
| 🏅 Best Local-First App | All reads from local SQLite, all writes via powerSync.execute(), offline mutations, on-device embeddings stored in SQLite, offline AI memory, session + project collaboration - 11 tables, 5 sync buckets, zero custom backend |
| 🏅 Best Submission Using Mastra | Two specialized Mastra Cloud agents (Session Debugger + Project Analyzer) called via JWT-verified Edge Function proxy with structured JSON output and rich diff-format UI |
**How does session collaboration work?**
When you open a session, a presence row is written to session_presence via powerSync.execute(). PowerSync syncs this to all users who have access via WAL. The checklist and chat work the same way. No WebSocket, no polling, no custom backend.
**How does project collaboration work?**
When you open a project, a project_presence row is written. Every session mutation automatically logs an event to project_activity via useSessions. All project collaborators see these events in real time via their local SQLite.
**How does the Hybrid RAG work?**
When you log a session, transformers.js (Xenova/all-MiniLM-L6-v2) generates a 384-dimension embedding in the browser. This is stored in error_embedding via powerSync.execute(). When you open a session, both keyword scoring and cosine similarity run against your local SQLite history - zero network, works offline.
**What are the Mastra agents?**
Two agents deployed to Mastra Cloud: Session Debugger (deep analysis of a single error with diff-format fix) and Project Analyzer (pattern detection across all sessions). Both are called via a JWT-verified Supabase Edge Function so the Mastra API key never reaches the browser.
**Is the Groq API key safe?**
Yes. Stored in Supabase Edge Function Secrets. All AI calls go through analyze-bug which verifies JWT before calling Groq. Rate limited to 20 requests per user per hour.
**Does offline mode really work?**
Yes. All reads come from local SQLite. Writes queue via powerSync.execute() and upload on reconnect. Similar Sessions, activity feed, embeddings, and all collaboration data read from local SQLite. Offline Memory Assist synthesizes guidance from your history when you have no internet.
**Do I need a backend server?**
No. Supabase handles auth, database, storage, and Edge Functions. PowerSync handles sync and real-time collaboration. Mastra Cloud hosts the AI agents. No Express or Node.js backend required.
This project is licensed under the MIT License - see the LICENSE file for details.