GaitKeeper is an AI-powered running gait analysis application built with Next.js, TensorFlow.js, and Claude AI.
⚠️ EXPERIMENTAL SOFTWARE DISCLAIMER
This is experimental software under active development and is provided "as-is" without warranty of any kind. It is not production-ready and should be used for research, educational, or experimental purposes only. Do not rely on this software for medical decisions or professional gait analysis. Always consult qualified healthcare professionals for medical advice.
Demo video: GaitKeeper_Demo_Video.mp4
- Video Upload: Upload running videos (treadmill or outdoor) for analysis
- Advanced Pose Estimation: MediaPipe BlazePose with 33 keypoints including detailed foot tracking (heel/toe)
- Comprehensive Gait Metrics: Cadence, stride length, foot strike pattern, foot strike position, joint angles, and estimated loads
- Kalman Filter Tracking: Smooth ankle tracking with velocity-based foot-strike detection
- Real-time Foot-Strike Visualization: Animated arrows showing exact foot contact points during landing
- AI Suggestions: Get personalized form improvement recommendations from Claude AI
- Interactive Chat: Ask questions about your gait analysis and get expert advice
- Smooth 60 FPS Playback: Interpolated pose rendering for fluid visualization
- Framework: Next.js 14 (App Router)
- UI: React, TailwindCSS, shadcn/ui
- Pose Detection: MediaPipe BlazePose (TensorFlow.js runtime with automatic MoveNet fallback)
- Motion Tracking: Custom Kalman Filter implementation for ankle trajectory smoothing
- Database: Prisma ORM with SQLite (local) / PostgreSQL (production)
- AI: Anthropic Claude API
- Storage: Local filesystem (dev) / Vercel Blob (production)
- Node.js 18+ and npm
- Anthropic API key (get one at https://console.anthropic.com/)
1. Clone the repository (if not already done):

   ```bash
   git clone <your-repo-url>
   cd GaitKeeper
   ```

2. Install dependencies:

   ```bash
   npm install
   ```

3. Set up environment variables: create a `.env` file in the root directory:

   ```bash
   cp .env.example .env
   ```

   Edit `.env` and add your Anthropic API key:

   ```
   DATABASE_URL="file:./dev.db"
   ANTHROPIC_API_KEY="your_anthropic_api_key_here"
   ```

4. Initialize the database:

   ```bash
   npx prisma generate
   npx prisma db push
   ```

5. Run the development server:

   ```bash
   npm run dev
   ```

6. Open your browser: navigate to http://localhost:3000
To get started quickly with a sample running video:
1. Download a sample video from Pexels:
   - Visit: https://www.pexels.com/search/videos/running/
   - Search for "running side view" or "treadmill running"
   - Download a ~10 second video

2. Place the video:

   ```bash
   mkdir -p public/uploads/videos
   mv ~/Downloads/your-video.mp4 public/uploads/videos/sample-running.mp4
   ```

3. Add it to the database:

   ```bash
   npm run add-sample
   ```

Alternatively, take your captured or downloaded video and add it manually using the "Upload Video" button on the home page.
See SAMPLE_VIDEO_GUIDE.md for detailed instructions and recommended videos.
- Click the "Upload Video" button on the home page
- Enter a name for your video
- Select a video file (MP4, MOV, etc.); a clip of roughly 10 seconds is recommended
- Click "Upload"
- Click on a video thumbnail to open the analysis page
- Click "Analyze Gait" to start pose estimation
- Wait for the analysis to complete (may take 1-2 minutes)
- View metrics as you scrub through the video
- After analysis is complete, go to the "Suggestions" tab
- Click "Generate Suggestions" to get AI-powered recommendations
- Review suggestions categorized by priority (high, medium, low)
- Go to the "Chat" tab
- Ask questions like:
- "Why is my cadence important?"
- "How can I reduce knee impact?"
- "What does my foot strike pattern mean?"
MediaPipe BlazePose Integration:
- 33 Keypoints: Full body tracking including detailed foot landmarks (heel, ankle, toe)
- MediaPipe Runtime: Uses Google's production-grade WebAssembly implementation for stability
- Automatic Fallback: Gracefully falls back to MoveNet (17 keypoints) if BlazePose fails
- Model-Agnostic Design: Dynamic keypoint mapping adapts to whichever model is loaded
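The model-agnostic mapping can be sketched as a small lookup layer: metrics code asks for a semantic keypoint name and degrades gracefully when the loaded model lacks it. This is an illustrative sketch, not the app's actual API; the keypoint names follow the TensorFlow.js pose-detection conventions for BlazePose/MoveNet, while the helper names are assumptions.

```typescript
// Hypothetical sketch of model-agnostic keypoint lookup. Keypoint names follow
// the BlazePose/MoveNet conventions; helper names are assumptions.
type Keypoint = { name: string; x: number; y: number; score: number };

function getKeypoint(pose: Keypoint[], name: string): Keypoint | undefined {
  return pose.find((kp) => kp.name === name);
}

// BlazePose exposes heel and toe ("foot_index") landmarks; MoveNet does not,
// so fall back to the ankle when they are missing.
function footContactPoint(
  pose: Keypoint[],
  side: "left" | "right"
): { x: number; y: number } | undefined {
  const heel = getKeypoint(pose, `${side}_heel`);
  const toe = getKeypoint(pose, `${side}_foot_index`);
  if (heel && toe) {
    return { x: (heel.x + toe.x) / 2, y: (heel.y + toe.y) / 2 }; // midfoot
  }
  return getKeypoint(pose, `${side}_ankle`); // MoveNet fallback
}
```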
Kalman Filter Foot-Strike Detection:
- State Tracking: Tracks ankle position (x, y) and velocity (vx, vy) in real-time
- Velocity-Based Detection: Distinguishes landing (downward deceleration) from toe-off (upward acceleration)
- Noise Filtering: Smooths out pose detection jitter for stable tracking
- Ground Level Estimation: Automatically calibrates ground plane from ankle positions
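The tracking idea above can be sketched as a 1D constant-velocity Kalman filter per coordinate. This is a simplified illustration of the approach, not the project's `lib/kalman-filter.ts`; the class name, noise parameters, and detection thresholds are all made up for the example.

```typescript
// Minimal 1D constant-velocity Kalman filter (illustrative simplification of
// the ankle tracker). State: [position, velocity]; measurement: noisy position.
class Kalman1D {
  x = 0; // position estimate
  v = 0; // velocity estimate
  private p = [[1, 0], [0, 1]]; // estimate covariance

  constructor(private q = 0.01, private r = 4) {} // process / measurement noise

  // Fuse one noisy position measurement z taken dt seconds after the last.
  update(z: number, dt: number): { x: number; v: number } {
    // Predict: constant-velocity motion model.
    const xPred = this.x + this.v * dt;
    const vPred = this.v;
    const [[p00, p01], [p10, p11]] = this.p;
    const q00 = p00 + dt * (p10 + p01) + dt * dt * p11 + this.q;
    const q01 = p01 + dt * p11;
    const q10 = p10 + dt * p11;
    const q11 = p11 + this.q;
    // Correct: blend prediction with the measurement via the Kalman gain.
    const s = q00 + this.r;
    const k0 = q00 / s;
    const k1 = q10 / s;
    const innov = z - xPred;
    this.x = xPred + k0 * innov;
    this.v = vPred + k1 * innov;
    this.p = [
      [(1 - k0) * q00, (1 - k0) * q01],
      [q10 - k1 * q00, q11 - k1 * q01],
    ];
    return { x: this.x, v: this.v };
  }
}

// Velocity-based strike detection (illustrative thresholds): the ankle was
// moving down (v > 0 in image coordinates), decelerates to ~0, and is near
// the estimated ground level.
function isFootStrike(y: number, v: number, prevV: number, groundY: number): boolean {
  return prevV > 0 && v <= 0.5 && Math.abs(y - groundY) < 15;
}
```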
Visual Feedback:
- Animated Arrows: Fade-in/fade-out arrows at foot-strike locations (300ms duration)
- Precise Positioning: Uses actual heel and toe keypoints to show midfoot contact point
- Color Coding: Amber for left foot, red for right foot
- 60 FPS Rendering: Smooth interpolation between detected poses for fluid playback
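The 60 FPS rendering above boils down to linear interpolation between detected poses; a minimal sketch (helper names are assumptions, not the app's actual code):

```typescript
// Poses detected at a lower rate are linearly interpolated so the canvas
// overlay can render at 60 FPS.
type Point = { x: number; y: number };

const lerp = (a: number, b: number, t: number): number => a + (b - a) * t;

// Blend every keypoint between two detected poses at fraction t in [0, 1].
function interpolatePose(prev: Point[], next: Point[], t: number): Point[] {
  return prev.map((kp, i) => ({
    x: lerp(kp.x, next[i].x, t),
    y: lerp(kp.y, next[i].y, t),
  }));
}

// For a playback timestamp, compute the blend factor between the two
// surrounding detected frames (t0 <= time <= t1), clamped to [0, 1].
function blendFactor(time: number, t0: number, t1: number): number {
  return t1 === t0 ? 0 : Math.min(1, Math.max(0, (time - t0) / (t1 - t0)));
}
```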
Comprehensive Analysis:
- Cadence: Steps per minute calculated from ankle vertical motion patterns
- Stride Length: Distance covered per stride using body height calibration
- Foot Strike Pattern: Heel/midfoot/forefoot classification using heel-toe angle analysis
- Foot Strike Position: Landing position relative to center of gravity
- Estimated Loads: Impact forces on major joints
Adaptive Processing:
- All metrics automatically adapt to available keypoints (BlazePose vs MoveNet)
- Graceful degradation when heel/toe keypoints unavailable
- Confidence scoring for each metric based on keypoint visibility
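As a rough illustration of the cadence and confidence ideas above (these are not the app's exact formulas in `lib/metrics.ts`):

```typescript
// Cadence: steps per minute derived from the timestamps of detected foot strikes.
function cadenceSpm(strikeTimesSec: number[]): number {
  if (strikeTimesSec.length < 2) return 0;
  const span = strikeTimesSec[strikeTimesSec.length - 1] - strikeTimesSec[0];
  const steps = strikeTimesSec.length - 1;
  return span > 0 ? (steps / span) * 60 : 0;
}

// Confidence: mean visibility score of the keypoints a metric relied on.
function metricConfidence(keypointScores: number[]): number {
  if (keypointScores.length === 0) return 0;
  return keypointScores.reduce((a, b) => a + b, 0) / keypointScores.length;
}
```

For example, strikes every third of a second yield 180 steps per minute, a typical target cadence for runners.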
GaitKeeper/
├── app/ # Next.js app directory
│ ├── api/ # API routes
│ │ ├── chat/ # Chat endpoint
│ │ └── videos/ # Video upload & analysis
│ ├── analysis/[id]/ # Analysis page
│ ├── layout.tsx # Root layout
│ ├── page.tsx # Home page
│ └── globals.css # Global styles
├── components/ # React components
│ ├── ui/ # shadcn/ui components
│ ├── analysis-view.tsx # Main analysis view
│ ├── chat-interface.tsx # Chat component
│ ├── metrics-panel.tsx # Metrics display
│ ├── upload-dialog.tsx # Upload modal
│ ├── video-list.tsx # Video grid
│ └── video-player.tsx # Video player with pose overlay
├── lib/ # Utilities and libraries
│ ├── ai.ts # Claude AI integration
│ ├── db.ts # Prisma client
│ ├── kalman-filter.ts # 2D Kalman filter for ankle tracking
│ ├── metrics.ts # Gait metrics calculation
│ ├── pose-estimation.ts # MediaPipe BlazePose integration
│ ├── storage.ts # Storage abstraction layer
│ └── utils.ts # Helper functions
├── prisma/ # Database schema
│ └── schema.prisma # Prisma schema definition
└── public/ # Static files
└── uploads/ # Uploaded videos (gitignored)
- Video: Stores video metadata and file paths
- Analysis: Stores pose data and calculated metrics
- Suggestion: AI-generated improvement suggestions
- ChatMessage: Chat conversation history
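A minimal sketch of how these four models might look in `prisma/schema.prisma` (field names are assumptions; consult the actual schema):

```prisma
// Hypothetical field names -- see prisma/schema.prisma for the real schema.
model Video {
  id        String     @id @default(cuid())
  name      String
  filePath  String
  createdAt DateTime   @default(now())
  analyses  Analysis[]
}

model Analysis {
  id       String @id @default(cuid())
  videoId  String
  video    Video  @relation(fields: [videoId], references: [id])
  poseData String // JSON-serialized per-frame keypoints
  metrics  String // JSON-serialized cadence, stride length, etc.
}

model Suggestion {
  id       String @id @default(cuid())
  videoId  String
  priority String // "high" | "medium" | "low"
  content  String
}

model ChatMessage {
  id        String   @id @default(cuid())
  videoId   String
  role      String   // "user" | "assistant"
  content   String
  createdAt DateTime @default(now())
}
```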
1. Push your code to GitHub

2. Create a new project on Vercel:
   - Import your GitHub repository
   - Vercel will auto-detect Next.js

3. Add environment variables:
   - `ANTHROPIC_API_KEY`: Your Claude API key
   - `DATABASE_URL`: PostgreSQL connection string (use Vercel Postgres)
   - `STORAGE_TYPE`: Set to `vercel-blob`
   - `BLOB_READ_WRITE_TOKEN`: Your Vercel Blob token

4. Update the storage adapter (if using Vercel Blob):
   - Implement `VercelBlobStorageAdapter` in `lib/storage.ts`
   - Install the `@vercel/blob` package

5. Deploy:

   ```bash
   vercel --prod
   ```
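The storage abstraction mentioned above might look like the following sketch (interface and class names are assumptions, not the actual contents of `lib/storage.ts`). A `VercelBlobStorageAdapter` would implement the same interface, returning the URL from `@vercel/blob`'s `put` call instead of a local path.

```typescript
import { promises as fs } from "node:fs";
import * as os from "node:os";
import * as path from "node:path";

// Hypothetical storage abstraction: each adapter stores a file and returns
// where it can be fetched from (a local path in dev, a blob URL in production).
interface StorageAdapter {
  save(filename: string, data: Buffer): Promise<string>;
}

// Local-filesystem adapter used in development.
class LocalStorageAdapter implements StorageAdapter {
  constructor(private baseDir: string) {}

  async save(filename: string, data: Buffer): Promise<string> {
    await fs.mkdir(this.baseDir, { recursive: true });
    const filePath = path.join(this.baseDir, filename);
    await fs.writeFile(filePath, data);
    return filePath;
  }
}
```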
To switch to PostgreSQL in production, update `DATABASE_URL` in `.env` and push the schema:

```bash
npx prisma db push
```

- User authentication (NextAuth.js)
- Video comparison (compare multiple runs)
- Progress tracking over time
- Export analysis reports (PDF)
- Mobile app support
- Real-time video capture
- Coach/athlete collaboration tools
Contributions are welcome! Please feel free to submit a Pull Request.
See LICENSE file for details.
- Google MediaPipe team for BlazePose model
- TensorFlow.js team for the runtime and MoveNet fallback model
- Anthropic for Claude AI API
- shadcn/ui for beautiful UI components