GemAura is a Flutter app that demonstrates the power of on-device, offline, multimodal AI using the flutter_gemma package. It empowers the visually impaired, allergy sufferers, and Alzheimer's patients with real-time AI assistance from a local Gemma 3n model, entirely offline.
This project was built as a proof of concept for my Kaggle writeup, which details the entire journey and implementation: gemaura-a-flutter-multi-assistance-health-app-Writeup
- Offline First: The entire AI interaction happens on-device, with no internet connection required.
- Multimodal Input: Understands audio, text, and images, with significantly enhanced video understanding.
- Powered by Gemma 3n: Utilizes Google's lightweight yet powerful Gemma 3n model.
- Blind Assist: A vital tool for the visually impaired. Users can simply open the camera and, using voice commands, ask the app to describe what is in front of them.
- Allergy Checker: Helps users make informed, safe decisions about their food. By taking a picture of a product's label, the app scans the ingredients against the user's potential allergens.
- Alzheimer's Helper: Assists with recalling important memories and daily information through natural conversation.
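
The assistants above all reduce to the same pattern: capture an image and a (transcribed) voice prompt, then query the local model. The sketch below shows how this could look with the flutter_gemma package; the API names (`FlutterGemmaPlugin.instance`, `createModel`, `Message.withImage`) follow the package's documented surface but may differ across versions, and `describeScene` is a hypothetical helper, not code from this repository.

```dart
import 'dart:typed_data';

import 'package:flutter_gemma/flutter_gemma.dart';

/// Illustrative sketch: ask the local Gemma 3n model to describe a camera
/// frame, e.g. for the Blind Assist mode. Method names are assumptions
/// based on the flutter_gemma package and may vary between versions.
Future<String> describeScene(Uint8List imageBytes, String spokenPrompt) async {
  // Create the on-device inference model with vision support enabled.
  final model = await FlutterGemmaPlugin.instance.createModel(
    modelType: ModelType.gemmaIt, // instruction-tuned Gemma variant
    supportImage: true,           // enable multimodal (image) input
    maxTokens: 512,
  );

  final chat = await model.createChat(supportImage: true);

  // Combine the user's voice command (already transcribed to text)
  // with the captured camera frame.
  await chat.addQueryChunk(Message.withImage(
    text: spokenPrompt, // e.g. "What is in front of me?"
    imageBytes: imageBytes,
    isUser: true,
  ));

  // Inference runs entirely on-device; no network connection is used.
  final response = await chat.generateChatResponse();
  return response.toString();
}
```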
To run this project locally, follow these steps:
- Download the latest APK from GitHub Releases and transfer it to your Android device.
- flutter_gemma: The core package for running Gemma models on-device.
- LiteRT: Lightweight runtime for optimized model execution.
- LLM Inference API: Runs large language models on-device.
- Hugging Face Integration: For model discovery and download.
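
To illustrate how the Hugging Face integration fits in, the sketch below fetches a Gemma 3n LiteRT model once so that every later launch works fully offline. The model URL, file name, and the `modelManager.downloadModelFromNetwork` call are assumptions based on the flutter_gemma package's documented surface, not code taken from this repository.

```dart
import 'package:flutter_gemma/flutter_gemma.dart';

/// Illustrative sketch: one-time model download from Hugging Face.
/// URL and method names are assumptions and may differ across versions.
Future<void> installModel() async {
  // Hypothetical Hugging Face URL for a Gemma 3n LiteRT (.task) bundle.
  const modelUrl =
      'https://huggingface.co/google/gemma-3n-E2B-it-litert-preview/'
      'resolve/main/gemma-3n-E2B-it-int4.task';

  final manager = FlutterGemmaPlugin.instance.modelManager;

  // Download the model to local storage; once cached, the app needs
  // no internet connection at all.
  await manager.downloadModelFromNetwork(modelUrl);
}
```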



