ResumeIQ is an AI-powered resume analysis platform that helps candidates optimize their resumes for Applicant Tracking Systems (ATS) and recruiter expectations.
Built on a modern SaaS architecture using Supabase, AI integrations, and a secure database design.
- 🔐 Supabase Authentication (JWT-based)
- 📄 Resume Upload (PDF/DOCX)
- 🗂 Versioned Resume System
- 🤖 AI-Powered Resume Analysis
- 📊 Structured ATS Scoring
- 💳 Stripe Subscription Support (planned / in progress)
- 💰 Usage & Credit Tracking
- 🔒 Row-Level Security (RLS) enforced
- 🧪 Unit Testing with Coverage Enforcement
- ☁️ SonarCloud Quality Gate CI
```
Frontend (React + TanStack Router)
        ↓
Supabase Auth + Storage
        ↓
Supabase Postgres (RLS enforced)
        ↓
Backend API (AI + Billing Logic)
        ↓
OpenAI / Stripe
```
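In this layout the frontend never holds AI or billing secrets; it only forwards the user's Supabase JWT to the backend, which verifies it server-side. A minimal sketch of how such a call could be assembled (the route path and payload shape are hypothetical, not the project's actual API):

```typescript
// Hypothetical sketch of a frontend call to the backend API.
// The backend verifies the Supabase JWT; OpenAI and Stripe
// secrets never leave the server.
export function buildAnalyzeRequest(
  accessToken: string,
  resumeId: string,
): { url: string; init: { method: string; headers: Record<string, string> } } {
  return {
    url: `/api/resumes/${resumeId}/analyze`, // hypothetical backend route
    init: {
      method: 'POST',
      headers: {
        Authorization: `Bearer ${accessToken}`, // Supabase session JWT
        'Content-Type': 'application/json',
      },
    },
  };
}
```

The request object can then be passed straight to `fetch(req.url, req.init)`.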
- No direct DB exposure
- RLS on every table
- Backend handles billing + AI secrets
- Deterministic CI builds
- Production-grade schema constraints
Frontend:
- React
- TanStack Router
- TypeScript
- Vitest (unit testing)

Backend:
- Node.js (planned modular service layer)
- Supabase (Postgres + Auth + Storage)
- OpenAI (AI analysis)
- Stripe (billing)

Infrastructure:
- Supabase
- GitHub Actions
- SonarCloud
```bash
git clone https://github.com/frank-mendez/ResumeIQ
cd ResumeIQ
cp .env.example .env
npm install
npm run dev
```

Make sure environment variables are configured before running.
Create a `.env` file:

```bash
# Supabase (Frontend)
VITE_SUPABASE_URL=
VITE_SUPABASE_ANON_KEY=

# Backend Only
SUPABASE_SERVICE_ROLE_KEY=
OPENAI_API_KEY=
STRIPE_SECRET_KEY=
STRIPE_WEBHOOK_SECRET=
```

⚠ Never expose `SUPABASE_SERVICE_ROLE_KEY` to the frontend.
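To fail fast when a variable is missing, a small validation helper can check the environment at startup. This is an illustrative sketch (the helper is not part of the codebase); the key names match the `.env` file above:

```typescript
// Illustrative helper: report which required env vars are missing.
export function missingEnvKeys(
  env: Record<string, string | undefined>,
  keys: string[],
): string[] {
  // Empty array means every key is present and non-blank.
  return keys.filter((k) => !env[k] || env[k]!.trim() === '');
}

// Key groups from the .env file above.
const FRONTEND_KEYS = ['VITE_SUPABASE_URL', 'VITE_SUPABASE_ANON_KEY'];
const BACKEND_KEYS = [
  'SUPABASE_SERVICE_ROLE_KEY',
  'OPENAI_API_KEY',
  'STRIPE_SECRET_KEY',
  'STRIPE_WEBHOOK_SECRET',
];
```

The frontend build would check `FRONTEND_KEYS` against `import.meta.env`, while the backend checks `BACKEND_KEYS` against `process.env`.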
- Create a Supabase project.
- Enable Row Level Security on all tables.
- Create a storage bucket named `resumes`.
- Apply migrations from `/supabase/migrations`.
- Ensure RLS policies are enabled.
Core tables:
- `profiles`
- `resumes`
- `resume_versions`
- `resume_analyses`
- `user_credits`
- `subscriptions`
- `payments`
- `usage_logs`
- Foreign keys are `NOT NULL`
- Stripe IDs are unique
- One active version per resume
- Soft delete support
- AI analysis status tracking
- Credit-based usage control
- Strict RLS enforcement
- Validate file type (PDF/DOCX)
- Validate file size (max 5MB)
- Upload to Supabase Storage
- Insert metadata into `resumes`
- Create initial `resume_version`
- Trigger AI processing (backend)

Storage path format:

```
user_id/resume_id/original_filename
```

AI analysis flow:

Frontend → Backend → OpenAI → Store Analysis → Deduct Credits
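The upload checks and storage path format described above can be sketched as pure functions. The accepted MIME types and error shape below are assumptions for illustration, not the project's exact implementation:

```typescript
const MAX_BYTES = 5 * 1024 * 1024; // 5 MB limit from the upload rules above

// Assumed MIME types for PDF and DOCX uploads.
const ALLOWED_TYPES = [
  'application/pdf',
  'application/vnd.openxmlformats-officedocument.wordprocessingml.document',
];

export function validateResume(file: {
  type: string;
  size: number;
}): { ok: true } | { ok: false; reason: string } {
  if (!ALLOWED_TYPES.includes(file.type)) return { ok: false, reason: 'type' };
  if (file.size > MAX_BYTES) return { ok: false, reason: 'size' };
  return { ok: true };
}

// Builds the storage path: user_id/resume_id/original_filename
export function storagePath(
  userId: string,
  resumeId: string,
  filename: string,
): string {
  return `${userId}/${resumeId}/${filename}`;
}
```

Because the path is prefixed with `user_id`, a storage RLS policy can restrict each user to their own folder.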
- Status tracked (`pending`, `processing`, `completed`, `failed`)
- Token usage logged
- Cost tracking supported
- Credits updated transactionally
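The four statuses above can be modeled as a string union with a transition table. The forward-only transition rules below are an assumption about how the lifecycle might work, not a documented constraint:

```typescript
export type AnalysisStatus = 'pending' | 'processing' | 'completed' | 'failed';

// Assumed rules: runs only move forward; completed/failed are terminal.
const TRANSITIONS: Record<AnalysisStatus, AnalysisStatus[]> = {
  pending: ['processing', 'failed'],
  processing: ['completed', 'failed'],
  completed: [],
  failed: [],
};

export function canTransition(from: AnalysisStatus, to: AnalysisStatus): boolean {
  return TRANSITIONS[from].includes(to);
}
```

Enforcing the same rule in a Postgres `CHECK` constraint or trigger keeps the database consistent even if the backend has a bug.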
- Stripe subscriptions
- Payment intents logged
- Status validation enforced
- Financial tables backend-controlled
- Users can view but not mutate financial records
Run tests:

```bash
npm run test
```

Run with coverage:

```bash
npm run test:coverage
```

Coverage is required for CI and the SonarCloud Quality Gate. Pre-commit hooks also enforce coverage thresholds.
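Coverage thresholds are typically declared in the Vitest config so that `npm run test:coverage` fails when they are not met. A sketch of what that might look like (the 80% figures and reporter list are illustrative, not the project's actual settings):

```typescript
// vitest.config.ts — illustrative; actual thresholds may differ.
import { defineConfig } from 'vitest/config';

export default defineConfig({
  test: {
    coverage: {
      provider: 'v8',
      reporter: ['text', 'lcov'], // lcov output feeds SonarCloud
      thresholds: {
        lines: 80,
        branches: 80,
        functions: 80,
        statements: 80,
      },
    },
  },
});
```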
CI runs:
- Lint
- Unit tests
- Coverage
- Build
- SonarCloud analysis
Quality Gate enforces:
- Coverage on new code
- No critical vulnerabilities
- Maintainability standards
- RLS enabled on all user-owned tables
- Financial and credit tables are backend-write only
- No service role key exposed client-side
- Strict storage path isolation
- Stripe webhook validation required
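For webhook validation, production code should use the official `stripe.webhooks.constructEvent` helper; the sketch below only illustrates Stripe's documented signing scheme (HMAC-SHA256 over `"<timestamp>.<raw body>"` with the webhook secret, carried in the `Stripe-Signature` header):

```typescript
import { createHmac, timingSafeEqual } from 'node:crypto';

// Sketch of Stripe's documented webhook signing scheme.
// header looks like: "t=1700000000,v1=<hex signature>"
export function verifyStripeSignature(
  payload: string,
  header: string,
  secret: string,
): boolean {
  const parts = Object.fromEntries(
    header.split(',').map((kv) => kv.split('=') as [string, string]),
  );
  if (!parts.t || !parts.v1) return false;
  const expected = createHmac('sha256', secret)
    .update(`${parts.t}.${payload}`)
    .digest('hex');
  const a = Buffer.from(expected);
  const b = Buffer.from(parts.v1);
  // Constant-time comparison to avoid timing side channels.
  return a.length === b.length && timingSafeEqual(a, b);
}
```

A real handler would also reject timestamps outside a tolerance window (Stripe's default is 5 minutes) to prevent replay attacks.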
Ready for:
- 10k+ users
- Credit-based monetization
- AI cost tracking
- Admin dashboards
- Subscription tiers
- Resume parsing improvements
- Multi-language resume support
- Team / organization accounts
- AI resume rewriting
- Interview prep module
MIT License
Built by Frank Mendez.