The debug APK and the release APK are available under "Releases" in our repo. To install without a signing key, use the debug APK; the release APK is used for production.
Made by Team Astro Bugs (SRM Institute of Science and Technology, KTR)
- Video Demo URL: YouTube video (live app demo)
- PPT Report: AuraFit_AstroBugs.pptx
- Supplementary Comprehensive Report: AuraFit_AstroBugs.docx
Platform: Android (Kotlin)
- Overview
- Core Features
- System Architecture
- Tech Stack
- Project Structure
- Setup and Installation
- System Requirements
- Engineering Decisions & Bottlenecks
- Team Contribution
AuraFit addresses a critical gap in the at-home fitness market: the lack of accessible, private, and intelligent coaching. While many apps track workouts, they fail to provide the real-time, expert feedback necessary for safe and effective exercise. Existing AI solutions often compromise user privacy by streaming sensitive camera data to the cloud.
AuraFit is a revolutionary Android application, built entirely in Kotlin, that transforms a smartphone and a Samsung Galaxy Watch into a seamless, on-device AI coaching system. By fusing real-time computer vision with live biometric data, AuraFit delivers an unparalleled, expert-level training experience with a 100% privacy-first guarantee.
- Multi-Modal Form Coaching: Real-time form analysis for multiple exercises (e.g., push-ups, squats), enriched with live heart rate data streamed directly from the Galaxy Watch.
- Automated Rep Counting: Hands-free, accurate repetition counting using state-machine logic built on body-angle detection.
- Data-Driven Dashboard: A personalized home screen showing live daily steps, calories (via Samsung Health), and dynamic daily challenges with progress toward weekly goals.
- Hands-Free Voice Feedback: AI-powered corrections and motivational cues delivered through Text-to-Speech so users can stay focused on training.
- Persistent Workout History: Completed sessions are securely stored in an on-device Room database, enabling a detailed history view and a profile with lifetime statistics.
- Login-Free Personalization: Greets users by name using Android's ContactsContract, delivering a personalized experience without requiring account creation.
- Hybrid AI Engine: Powered by a fully on-device Gemma LLM (1.3 GB) with a smart fallback engine, ensuring 100% uptime and on-device privacy and security of data.
- Daily Updates: Calendar-based logging of each day's data, serving as a personal workout diary.
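The angle-based rep counting described above can be sketched as a small state machine over a joint angle. The following is an illustrative Kotlin example, not AuraFit's actual code; the class name `RepCounter` and the 90°/160° thresholds are assumptions for the sketch:

```kotlin
import kotlin.math.atan2
import kotlin.math.abs

// Illustrative sketch (hypothetical names): a landmark is an (x, y) point,
// e.g. shoulder/elbow/wrist from a pose model.
data class Point(val x: Float, val y: Float)

// Angle at vertex b formed by segments b->a and b->c, in degrees [0, 180].
fun jointAngle(a: Point, b: Point, c: Point): Double {
    val raw = Math.toDegrees(
        atan2((c.y - b.y).toDouble(), (c.x - b.x).toDouble()) -
        atan2((a.y - b.y).toDouble(), (a.x - b.x).toDouble())
    )
    val norm = abs(raw) % 360
    return if (norm > 180) 360 - norm else norm
}

// Two-state machine: a rep is counted on the DOWN -> UP transition.
class RepCounter(
    private val downBelow: Double = 90.0,   // assumed "bottom of rep" threshold
    private val upAbove: Double = 160.0     // assumed "fully extended" threshold
) {
    private enum class Phase { UP, DOWN }
    private var phase = Phase.UP
    var reps = 0
        private set

    fun onAngle(angle: Double) {
        when (phase) {
            Phase.UP -> if (angle < downBelow) phase = Phase.DOWN
            Phase.DOWN -> if (angle > upAbove) { phase = Phase.UP; reps++ }
        }
    }
}
```

The gap between the two thresholds acts as hysteresis, so per-frame jitter around a single cutoff cannot double-count a rep.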
AuraFit is built as a high-performance, real-time data fusion system, optimized for on-device execution to guarantee privacy and offline reliability. At its core is our Hybrid AI Engine — a dual-model architecture that ensures maximum intelligence while guaranteeing 100% operational uptime, even under strict mobile memory constraints.
- Parallel Inputs
  - Camera Feed: Processed by Google's MediaPipe Pose Landmarker, extracting 33 skeletal keypoints per frame in real time.
  - Biometric Feed: Handled by a custom HealthDataManager using the Samsung Health Data SDK for live heart rate monitoring.
- On-Device Processing: Both data streams are fused within the Hybrid AI Engine, enabling synchronized physical + biometric analysis without external servers.
- Reflex Engine (Rule-Based AI): A lightweight, always-on engine running on every frame. It performs instantaneous, critical tasks such as rep counting and injury-prevention form alerts. Its efficiency ensures continuous operation regardless of system load.
- Cognitive Engine ("Plug-and-Play" LLM Slot): A deeper reasoning layer designed for extensibility. Any on-device generative model can be integrated. For our prototype, we deployed a 1.3 GB Gemma LLM with a crash-proof pause/resume protocol: the camera stream briefly freezes to release memory, then resumes seamlessly once inference is complete.
- Smart Fallback (100% Uptime Guarantee): If the Cognitive Engine cannot load due to memory limits, the app gracefully defaults to the Reflex Engine without interruption. This dual-AI architecture ensures the coaching experience remains intact under all conditions — no crashes, no loss of functionality.
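The smart-fallback pattern can be sketched in a few lines of Kotlin. This is a hypothetical illustration of the selection logic, not the app's real classes; `CoachingEngine`, `ReflexEngine`, and `HybridEngine` are invented names:

```kotlin
// Illustrative sketch of the dual-engine fallback (hypothetical names).
interface CoachingEngine { fun feedback(repAngle: Double): String }

// Reflex engine: lightweight rules, always available.
class ReflexEngine : CoachingEngine {
    override fun feedback(repAngle: Double) =
        if (repAngle < 70.0) "Ease up on depth to protect your joints" else "Good rep!"
}

// Engine selector: try to load the heavy LLM-backed engine; on any failure
// (e.g. OutOfMemoryError on low-RAM devices) fall back to the reflex engine.
class HybridEngine(loadCognitive: () -> CoachingEngine) {
    val active: CoachingEngine = try {
        loadCognitive()
    } catch (t: Throwable) {   // Throwable also catches OutOfMemoryError
        ReflexEngine()
    }
}
```

The key design point is catching `Throwable` rather than `Exception`: `OutOfMemoryError` is an `Error`, so a plain `catch (e: Exception)` would still crash the app when the model fails to load.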
Actionable feedback is delivered through two synchronized channels:
- On-Screen Visuals for clarity.
- Hands-Free Voice Coaching via Android’s native Text-to-Speech for uninterrupted workouts.
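Because form analysis runs per frame, voice cues need throttling before reaching TTS, or the coach would talk over itself. A minimal Kotlin sketch of such a rate limiter, under assumed names (`VoiceCoach` is hypothetical; in the real app `speak` would wrap Android's TextToSpeech):

```kotlin
// Hypothetical sketch: throttle voice cues so TTS doesn't fire on every frame.
class VoiceCoach(
    private val minGapMs: Long = 3000,              // assumed minimum gap between repeats
    private val speak: (String) -> Unit,            // would delegate to TextToSpeech.speak
    private val now: () -> Long = System::currentTimeMillis
) {
    private var lastSpokenAt = -minGapMs            // allow the very first cue immediately
    private var lastCue: String? = null

    // Speak only if the cue changed, or enough time has passed since the same cue.
    fun cue(text: String) {
        val t = now()
        if (text != lastCue || t - lastSpokenAt >= minGapMs) {
            speak(text)
            lastCue = text
            lastSpokenAt = t
        }
    }
}
```

Injecting the clock (`now`) keeps the throttling logic unit-testable without real time delays.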
- Platform: Android (Kotlin-first), ensuring modern language features, strong tooling support, and seamless integration with Jetpack libraries.
- Vision AI Engine: Google MediaPipe Pose Landmarker (tasks-vision:0.10.9) for real-time, on-device body keypoint detection (33 landmarks per frame).
- Wearable Integration: Samsung Health Data SDK (samsung-health-data-api-1.0.0.aar) to stream live biometric signals (e.g., heart rate) from Galaxy Watch devices.
- Cognitive AI Engine: Gemma LLM (1.3 GB, on-device) integrated as the reasoning layer within the Hybrid AI Engine, using a custom pause/resume protocol to handle memory constraints without crashes.
- Data Persistence: Android Room Persistence Library for secure, structured, and offline-ready storage of workout history.
- UI & Architecture: Android Jetpack components (Fragments, Navigation) combined with View Binding and Coroutines for clean, responsive, and maintainable UI workflows.
- Voice Output: Android native Text-to-Speech (TTS) engine for natural, hands-free coaching feedback.
The AuraFit codebase is organized into a modular and scalable structure, separating core AI logic, UI, data management, and resources for maintainability.
samsung_genai/
├── app/
│ ├── build.gradle # App-level Gradle configuration
│ ├── libs/
│ │ └── samsung-health-data-api-1.0.0.aar # Samsung Health SDK
│ ├── local.properties
│ ├── proguard-rules.pro
│ ├── download_tasks.gradle
│ └── src/
│ ├── androidTest/ # Instrumented tests
│ ├── test/ # Unit tests
│ └── main/
│ ├── AndroidManifest.xml
│ ├── assets/
│ │ └── pose_landmarker_full.task # MediaPipe model
│ ├── jniLibs/ # Native libraries (if any)
│ ├── java/
│ │ └── com/google/mediapipe/examples/poselandmarker/
│ │ ├── data/ # Room DB entities, DAO, Database
│ │ │ ├── AppDatabase.kt
│ │ │ ├── Challenge.kt
│ │ │ ├── WorkoutDao.kt
│ │ │ └── WorkoutSession.kt
│ │ ├── fragment/ # UI Fragments
│ │ │ ├── CameraFragment.kt
│ │ │ ├── DashboardFragment.kt
│ │ │ ├── PermissionsFragment.kt
│ │ │ ├── ProfileFragment.kt
│ │ │ └── WorkoutsFragment.kt
│ │ ├── HealthDataManager.kt # Samsung Health SDK manager
│ │ ├── HistoryActivity.kt
│ │ ├── HistoryAdapter.kt
│ │ ├── MainActivity.kt
│ │ ├── MainViewModel.kt
│ │ ├── OverlayView.kt
│ │ ├── PoseLandmarkerHelper.kt
│ │ ├── SplashActivity.kt
│ │ └── WorkoutActivity.kt
│ └── res/
│ ├── color/
│ │ ├── bg_nav_item.xml
│ │ └── bottom_nav_color.xml
│ ├── drawable/ # Images, icons, shapes
│ ├── layout/ # XML layout files
│ │ ├── activity_history.xml
│ │ ├── activity_main.xml
│ │ ├── activity_splash.xml
│ │ ├── activity_workout.xml
│ │ ├── calendar_day.xml
│ │ ├── fragment_camera.xml
│ │ ├── fragment_dashboard.xml
│ │ ├── fragment_profile.xml
│ │ ├── fragment_workouts.xml
│ │ └── item_history_entry.xml
│ ├── menu/
│ │ ├── bottom_nav_menu.xml
│ │ └── menu_bottom_nav.xml
│ ├── mipmap-hdpi/
│ ├── mipmap-mdpi/
│ ├── mipmap-xhdpi/
│ ├── mipmap-xxhdpi/
│ ├── mipmap-xxxhdpi/
│ ├── navigation/ # Navigation graphs
│ └── values/ # Colors, strings, dimens
├── build/ # Gradle build outputs
├── gradle/
├── gradle.properties
├── gradlew*
├── gradlew.bat
├── local.properties
├── pose_landmarker.png
├── settings.gradle
└── build.gradle # Project-level Gradle configuration
- Modular Java/Kotlin code under com.google.mediapipe.examples.poselandmarker/ separates:
  - data/ → Room database (entities, DAO, database)
  - fragment/ → UI screens
  - Core activities and helpers (MainActivity.kt, PoseLandmarkerHelper.kt, HealthDataManager.kt)
- Assets & Models: MediaPipe model (pose_landmarker_full.task) in assets/
- Resources: Layouts, menus, colors, drawables, and navigation graphs under res/
- External Libraries: Samsung Health SDK in app/libs/
- Branching Note: Samsung Health integration is implemented in the watch branch
- Android Studio Iguana (or newer)
- Physical Android device running API 29 (Android 10) or higher
- Samsung Galaxy Watch + Samsung Health app (required for full biometric functionality)
- Clone the repository:
  git clone https://github.com/dagaayush1205/Samsung-GENAI-Hackathon.git
  cd Samsung-GENAI-Hackathon
- Open the project: Launch Android Studio and open the cloned project directory.
- Add the Samsung Health SDK: Ensure samsung-health-data-api-1.0.0.aar is placed in the app/libs/ directory.
- Sync Gradle dependencies: Allow Android Studio to complete the Gradle sync.
- Run the app: Connect your physical Android device and click Run ▶️ in Android Studio. On first launch, allow access to:
  - Camera
  - Contacts
  - Samsung Health
- Gradle JDK: The project is configured to build with JVM 17. Ensure Android Studio's Gradle JDK is set to JDK 17 under File > Settings > Build, Execution, Deployment > Build Tools > Gradle. 👉 Download OpenJDK 17
- Target Device: A physical Android device running API 29 (Android 10) or higher. (Required by the Samsung Health SDK for biometric integration.)
- Samsung Health SDK: Download the Samsung Health Data SDK (samsung-health-data-api-1.0.0.aar) here: 🔗 Samsung Health Data SDK
- Cognitive AI Models (optional, for the Cognitive Engine): AuraFit supports plug-and-play on-device LLMs. For our prototype, we integrated a 1.3 GB Gemma model.
During development, we faced several significant technical challenges. Our solutions to these problems became a core part of AuraFit’s innovation.
- Challenge: Starting from scratch would have meant weeks of boilerplate work configuring CameraX + MediaPipe.
- Decision: We chose to build directly on the official MediaPipe Pose Landmarker sample, performing extensive in-place refactoring.
- Impact: This gave us a stable, production-ready camera + pose pipeline from day one, allowing the team to focus on our novel contributions (UI, database, AI logic, wearable SDK integration). Development velocity increased dramatically.
- Problem: Samsung provides multiple SDKs (Data, Sensor, Accessory). Our initial attempt with the Sensor SDK led to build failures, unstable runtime behavior, and inconsistent API support.
- Solution: We pivoted to the Samsung Health Data SDK, which is robust and officially documented.
- Built a modern, Kotlin-first HealthDataManager.kt wrapper based on Samsung's legacy Java examples.
- Implementation is currently maintained in the dedicated watch branch (not main).
- Achieved stable, reliable fetching of both historical stats and live heart rate data from the Galaxy Watch.
- Outcome: A clean, extensible integration that future-proofs wearable data collection.
- Ambition: We aimed to integrate Google’s Gemma 2B as an on-device LLM for generative workout feedback.
- Problem: Running the 1.3GB model concurrently with a live camera feed caused native memory crashes (SIGSEGV) on typical Android devices.
- Solution:
- Engineered a Hybrid AI architecture with a pause/resume protocol: the camera stream temporarily freezes to release memory, allowing the LLM to run safely.
- This worked: we validated the protocol with the Gemma model, and also built a comprehensive rule-based fallback engine for 100% uptime.
- Outcome: A crash-proof Cognitive Engine design that guarantees 100% uptime today, while paving the way for true on-device generative AI tomorrow.
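The pause/resume protocol described above can be sketched as a simple guard around inference. The names here (`PauseResumeRunner` and the two camera callbacks) are illustrative assumptions, not AuraFit's actual implementation:

```kotlin
// Illustrative sketch (hypothetical names): freeze the camera to free memory,
// run LLM inference, then resume — even if inference throws.
class PauseResumeRunner(
    private val pauseCamera: () -> Unit,    // would stop the CameraX/pose pipeline
    private val resumeCamera: () -> Unit    // would restart it afterwards
) {
    fun <T> runWithCameraPaused(inference: () -> T): T {
        pauseCamera()            // release camera/pose buffers before the LLM runs
        try {
            return inference()   // heavy on-device LLM call now has memory headroom
        } finally {
            resumeCamera()       // camera always restarts, even on failure
        }
    }
}
```

Putting the resume in a `finally` block is what makes the protocol crash-proof in spirit: a failed inference degrades to a skipped cue rather than a frozen camera.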
Key Takeaway:
AuraFit’s biggest innovations came directly from solving these bottlenecks:
- Leveraging stable foundations (MediaPipe sample refactoring)
- Building clean integrations (Samsung Health Data SDK manager)
- Pioneering crash-proof AI architectures (Hybrid Reflex + Cognitive engines with smart fallback)
Our team's success was driven by a clear division of roles, allowing for rapid, parallel development of the app's complex components.
- Pavithra CP: Engineered the core application structure and the Hybrid AI integration; implemented the MediaPipe Pose Landmarker.
- Ayush Daga: Integrated the Samsung Health SDK, working in the watch branch to bring in wearable and biometric support; managed the complex permission and connection lifecycle.
- Nikhil CP: Improved the UI and user experience, refining the XML layouts and fragments for an immersive feel; implemented the Room database for workout history and persistence.
- Dhruv Gupta: Helped with R&D, final integration, and the presentation; played a vital role in testing, onboarding, and debugging toward a polished project.
Made with Love <3 Team Astro Bugs (SRM IST, KTR)
