A comprehensive AI choreographer application that allows users to upload music files and generate realistic dance choreography with interactive 3D visualization.
- Single File Processing: Upload individual music files for personalized dance generation
- EDGE Model Integration: Uses state-of-the-art EDGE (Editable Dance GEneration) AI model
- Multiple Audio Formats: Supports WAV, MP3, FLAC, and M4A files
- Automatic Audio Processing: Converts audio to appropriate format for AI processing
- Interactive 3D Viewer: Full 3D avatar with interactive camera controls
- SMPL-to-FBX Conversion: Converts motion data to industry-standard FBX format
- Multiple View Modes: 2D preview and full 3D interactive modes
- Camera Controls: Orbit, zoom, pan controls with preset camera angles
- Avatar Customization: Wireframe toggle, opacity control, and show/hide options
- Synchronized Playback: 2D and 3D viewers sync with audio playback
- Speed Control: Multiple playback speeds from 0.25x to 2x
- Timeline Scrubbing: Interactive timeline with precise control
- Mirror Mode: Toggle for practice and learning
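Under the hood, keeping playback aligned with a speed-scaled clock comes down to a small mapping from elapsed time to a motion-frame index. A minimal Python sketch (the function name, the 30 fps default, and the clamping behavior are illustrative assumptions, not the app's actual API):

```python
def frame_for_elapsed(elapsed_s, speed=1.0, fps=30, total_frames=None):
    """Map elapsed wall-clock seconds to a motion-frame index.

    `speed` scales playback (0.25x-2x as in the UI); `total_frames`
    clamps the result so scrubbing past the end stays valid.
    """
    media_pos = elapsed_s * speed      # wall-clock time -> media time
    frame = int(media_pos * fps)       # media time -> frame index
    if total_frames is not None:
        frame = max(0, min(frame, total_frames - 1))
    return frame
```

The same mapping works in reverse for timeline scrubbing: divide the target frame by `fps` to get the media position to seek to.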
- Python Backend: Flask-based API server for dance generation
- Frontend Integration: Modern web interface with real-time progress tracking
- Database Integration: Supabase for user management and project storage
- File Management: Automatic file upload, processing, and download
- Python 3.8+ with virtual environment support
- Node.js 16+ with npm
- ffmpeg (for audio conversion)
- Git
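A quick way to verify these prerequisites before running the scripts (a minimal sketch; setup.sh presumably performs its own checks):

```python
import shutil
import sys

def check_prerequisites():
    """Return a list of human-readable problems (empty list = all good)."""
    problems = []
    if sys.version_info < (3, 8):
        problems.append("Python 3.8+ required")
    # Tools the README lists as prerequisites; all must be on PATH.
    for tool in ("ffmpeg", "git", "node", "npm"):
        if shutil.which(tool) is None:
            problems.append(f"missing executable: {tool}")
    return problems
```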
1. Clone and navigate to the project:

   ```bash
   cd /home/user/ChoreoAI
   ```

2. Run the setup script:

   ```bash
   ./setup.sh
   ```

3. Start the application:

   ```bash
   ./start.sh
   ```

4. Access the application:
   - Frontend: http://localhost:8080
   - Backend API: http://localhost:5000
1. Backend Setup:

   ```bash
   cd external
   python3 -m venv .venv
   source .venv/bin/activate
   pip install -r requirements.txt
   cd ..
   ```

2. Frontend Setup:

   ```bash
   npm install
   ```

3. Start Services:

   ```bash
   # Terminal 1: Backend
   python3 dance_server.py

   # Terminal 2: Frontend
   npm start
   ```
```
ChoreoAI/
├── index.html                     # Main web interface
├── script.js                      # Frontend JavaScript logic
├── styles.css                     # UI styling and animations
├── threejs-viewer.js              # 3D visualization component
├── supabase-service.js            # Database and auth service
├── dance_server.py                # Python backend API server
├── requirements.txt               # Python dependencies
├── setup.sh                       # Automated setup script
├── start.sh                       # Application launcher
└── external/                      # AI model and processing
    ├── single_music_generator.py  # Single-file dance generation
    ├── EDGE.py                    # EDGE model implementation
    ├── test.py                    # Original batch processing
    ├── checkpoint.pt              # Pre-trained model weights
    └── SMPL-to-FBX/               # 3D avatar conversion
        ├── Convert.py             # SMPL-to-FBX converter
        └── ybot.fbx               # Base avatar model
```
1. User Registration/Login: Create account or sign in
2. Music Upload: Drag & drop or browse to upload music file
3. Style Configuration: Choose dance style and preferences
4. Generation: AI processes music and generates dance motion
5. 3D Visualization: View and interact with 3D avatar performing the dance
6. Export: Download generated video, motion data, or FBX files
- POST /api/upload - Upload music file
- POST /api/generate - Start dance generation
- GET /api/status/<id> - Check generation progress
- GET /api/download/<id>/<type> - Download generated files
- DELETE /api/cleanup/<id> - Clean up generation files
- GET /api/health - Health check
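A typical client drives these endpoints by uploading a file, starting generation, then polling GET /api/status/<id> until the job completes. The polling loop can be sketched as follows; the HTTP call is injected as `fetch_status` so the control flow stays framework-agnostic, and the `"state"` field name is an assumption about the JSON the server returns:

```python
import time

def wait_for_generation(job_id, fetch_status, poll_interval=2.0, timeout=600.0):
    """Poll the status endpoint until the job finishes or times out.

    `fetch_status(job_id)` should issue GET /api/status/<id> and return
    the decoded JSON; the terminal "done"/"error" states are assumed.
    """
    deadline = time.monotonic() + timeout
    while time.monotonic() < deadline:
        status = fetch_status(job_id)
        if status.get("state") in ("done", "error"):
            return status
        time.sleep(poll_interval)
    raise TimeoutError(f"generation {job_id} did not finish within {timeout}s")
```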
- Cyberpunk Aesthetic: Dark theme with neon accents
- Responsive Design: Works on desktop and mobile
- Real-time Updates: Live progress tracking and notifications
- Interactive Elements: Smooth animations and hover effects
- Mouse Controls: Orbit (left-click drag), Zoom (scroll), Pan (right-click drag)
- Camera Presets: Front, Side, 3D, Top view buttons
- Avatar Controls: Show/hide, wireframe mode, opacity slider
- Reset View: One-click camera reset
- Architecture: Transformer-based dance generation
- Input: Audio features (Jukebox or baseline)
- Output: SMPL motion parameters
- Training: Trained on AIST++ dance dataset
- Motion Generation: EDGE model creates motion sequences
- SMPL Processing: Motion data in SMPL format
- FBX Conversion: Convert to industry-standard FBX
- 3D Rendering: Three.js-based real-time rendering
- Audio Conversion: Automatic format conversion via ffmpeg
- Motion Export: SMPL parameters saved as .pkl files
- 3D Export: FBX files for use in external 3D software
- Video Rendering: MP4 output with visual representation
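The ffmpeg-based audio conversion step can be sketched as a thin subprocess wrapper. The mono, 22050 Hz target is an assumption about what the feature extractor expects; adjust to match your pipeline:

```python
import subprocess

def ffmpeg_wav_args(src, dst, sample_rate=22050):
    """Build the ffmpeg command that converts `src` to mono WAV at `sample_rate`."""
    return ["ffmpeg", "-y", "-i", src,
            "-ac", "1", "-ar", str(sample_rate), dst]

def convert_to_wav(src, dst, sample_rate=22050):
    """Run the conversion, raising CalledProcessError on failure."""
    subprocess.run(ffmpeg_wav_args(src, dst, sample_rate), check=True)
```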
- Backend: Extend dance_server.py with new endpoints
- Frontend: Modify script.js for new UI features
- 3D Viewer: Update threejs-viewer.js for visualization enhancements
- Model Settings: Adjust parameters in single_music_generator.py
- UI Themes: Modify CSS variables in styles.css
- API Settings: Update endpoints in supabase-service.js
- Model Requirements: Requires EDGE checkpoint.pt file in external/ directory
- FBX Dependencies: SMPL-to-FBX conversion requires FBX Python SDK
- Performance: 3D rendering performance depends on device capabilities
- Audio Formats: Some formats may require additional ffmpeg codecs
- Virtual Environment: Ensure .venv is activated for Python commands
- Missing Dependencies: Run setup.sh to install all requirements
- Port Conflicts: Check if ports 5000 and 8080 are available
- FBX Conversion: May require manual FBX SDK installation
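The port-conflict check is easy to automate: try to bind the port and see whether it succeeds (a small diagnostic sketch, not part of the shipped scripts):

```python
import socket

def port_free(port, host="127.0.0.1"):
    """Return True if `port` can be bound, i.e. nothing is using it."""
    with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as s:
        s.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
        try:
            s.bind((host, port))
            return True
        except OSError:
            return False

if __name__ == "__main__":
    for p in (5000, 8080):  # backend and frontend ports
        print(f"port {p}: {'free' if port_free(p) else 'IN USE'}")
```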
- Check console logs for detailed error messages
- Ensure all dependencies are installed correctly
- Verify file permissions for uploaded content
- Multiple Avatar Models: Support for different character types
- Real-time Generation: Faster processing for immediate results
- Dance Style Transfer: Convert between different dance styles
- Collaborative Features: Share and remix generated choreographies
- Mobile App: Native mobile application development
Built with ❤️ using the EDGE AI model, Three.js, Flask, and modern web technologies.
The design follows the specified cyberpunk color palette:
- Primary Purple: #5E35B1 (Deep Purple)
- Accent Pink: #FF4081 (Neon Pink)
- Accent Cyan: #00E5FF (Cyan)
- Background: #121212 (Off-Black)
- Card Background: #1E1E1E (Dark Slate Gray)
- Text Primary: #FFFFFF (White)
- Text Secondary: #E0E0E0 (Very Light Gray)
- Text Muted: #B0B0B0 (Gray)
- Spacebar: Play/Pause
- Left Arrow: Rewind 10 seconds
- Right Arrow: Forward 10 seconds
- M: Toggle mirror mode
- Drag & Drop: Drag files directly onto upload areas
- File Browser: Click upload areas to open file picker
- Supported Formats: Audio files for music, video files for video upload
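Server-side, the supported-format list translates into a simple extension check before accepting an upload (an illustrative helper, not the app's actual validation code; real validation should also inspect the file's content):

```python
import os

# Extensions matching the formats listed above.
ALLOWED_AUDIO = {".wav", ".mp3", ".flac", ".m4a"}

def is_supported_audio(filename):
    """Cheap, case-insensitive extension check for uploaded music files."""
    return os.path.splitext(filename.lower())[1] in ALLOWED_AUDIO
```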
The interface is designed to support future gesture recognition:
- Swipe Left: Rewind
- Swipe Right: Forward
- Palm Up: Pause/Play toggle
- Double Tap: Toggle mirror view
- Modern Browsers: Chrome, Firefox, Safari, Edge (latest versions)
- CSS Features: Uses modern CSS Grid, Flexbox, and animations
- JavaScript: ES6+ features with class-based architecture
- Add new options to the genreSelect dropdown in HTML
- Add corresponding style items to the showcase section
- Update the selectStyle() method in JavaScript
Update CSS custom properties in the :root section of styles.css:

```css
:root {
  --primary-purple: #5E35B1;
  --accent-pink: #FF4081;
  --accent-cyan: #00E5FF;
  /* ... other colors */
}
```

The JavaScript is organized in a class-based structure for easy extension:
- Add new methods to the AIChoreographer class
- Update event listeners in setupEventListeners()
- Add corresponding UI elements in HTML
The frontend is designed to integrate with your AI choreographer API. Key integration points:
- File Upload: Replace file handling with actual API calls
- Generation Process: Connect progress simulation to real API progress
- Result Display: Replace skeleton animations with actual 3D avatar data
- Export Functions: Implement actual download and sharing functionality
- Optimized Animations: Uses CSS transforms and opacity for smooth performance
- Lazy Loading: Intersection Observer for scroll-triggered animations
- Efficient DOM Updates: Minimal DOM manipulation with class-based updates
- Responsive Images: Scalable vector icons and CSS-based graphics
- WebGL Integration: Replace CSS animations with 3D WebGL avatars
- Real-time Collaboration: Multi-user choreography sessions
- Advanced Editing: Timeline-based choreography editing
- Social Features: Share and discover choreographies
- AI Feedback: Real-time movement analysis and suggestions
For questions or issues with the frontend implementation, refer to the code comments or create an issue in your project repository.
Built with modern web technologies and designed for the future of AI-powered choreography.