A web application that accepts CSV exports from MyFitnessPal, processes the data, stores it in PostgreSQL, and visualizes key metrics through Grafana dashboards.
This project simplifies data archival and insight generation for MyFitnessPal users by providing:
- A web interface to upload CSV exports
- Data validation and transformation
- Secure PostgreSQL storage
- Interactive Grafana dashboards for visualization
- CSV Upload: Simple web form to upload MyFitnessPal CSV exports
- Data Processing: Backend validation and transformation of nutrition data
- PostgreSQL Storage: Optimized schema for time-series fitness data
- Grafana Integration: Pre-configured dashboards for calories, macronutrients, and exercise analysis
- Containerized: Easy deployment with Docker Compose
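The upload flow above can be sketched as a small Flask endpoint. This is a minimal illustration, not the project's actual code; the `/upload` route and the `file` form-field name are assumptions:

```python
# Hypothetical sketch of the CSV upload endpoint; route and field
# names are assumptions, and row counting stands in for real processing.
import io

from flask import Flask, request

app = Flask(__name__)

@app.route("/upload", methods=["POST"])
def upload():
    # Expect the CSV under a multipart form field named "file" (assumed name).
    uploaded = request.files.get("file")
    if uploaded is None or not uploaded.filename.endswith(".csv"):
        return {"error": "Please upload a .csv file"}, 400
    # Decode and count non-empty data rows as a stand-in for validation.
    text = uploaded.read().decode("utf-8")
    rows = [line for line in text.splitlines() if line.strip()]
    return {"rows_received": max(len(rows) - 1, 0)}, 201
```

A real handler would pass the parsed rows to the validation/transformation layer instead of just counting them.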
- Backend: Python with Flask
- Database: PostgreSQL
- Data Processing: Pandas and SQLAlchemy
- Frontend: HTML, Bootstrap, and JavaScript
- Visualization: Grafana
- Containerization: Docker and Docker Compose
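The pandas-based validation and transformation step might look like the following sketch. The column names (`Date`, `Calories`, `Protein (g)`, and so on) are assumptions about the MyFitnessPal export format, and the sample CSV is invented for illustration:

```python
# Sketch of the validation/transformation step using pandas.
# Column names are assumptions about the MyFitnessPal CSV export.
import io

import pandas as pd

SAMPLE = """Date,Meal,Calories,Protein (g),Carbohydrates (g),Fat (g)
2014-09-01,Breakfast,450,20,55,15
2014-09-01,Dinner,700,35,60,30
2014-09-02,Lunch,600,25,70,20
"""

def transform(csv_text: str) -> pd.DataFrame:
    """Parse a per-meal export and aggregate it to daily totals."""
    df = pd.read_csv(io.StringIO(csv_text), parse_dates=["Date"])
    # Drop rows with missing calories, then sum each nutrient per day.
    df = df.dropna(subset=["Calories"])
    return df.groupby("Date", as_index=False)[
        ["Calories", "Protein (g)", "Carbohydrates (g)", "Fat (g)"]
    ].sum()

daily = transform(SAMPLE)
```

Aggregating to daily totals before insertion keeps the time-series tables compact and maps directly onto per-day Grafana panels.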
- Docker and Docker Compose
- MyFitnessPal account with CSV export capability
git clone https://github.com/yourusername/myfitnessapp.git
cd myfitnessapp
You can modify the environment variables in docker-compose.yml if needed.
docker-compose up -d
This will start three containers:
- Flask application (port 5000)
- PostgreSQL database (port 5432)
- Grafana (container port 3000, published on host port 3001)
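A condensed docker-compose.yml for the three containers could look like this. Service names, credentials, and the `3001:3000` Grafana port mapping are illustrative assumptions, not the project's actual file:

```yaml
# Illustrative compose file; names, credentials, and ports are assumptions.
services:
  app:
    build: .
    ports:
      - "5000:5000"
    environment:
      DATABASE_URL: postgresql://mfp:mfp@db:5432/mfp
    depends_on:
      - db
  db:
    image: postgres:16
    environment:
      POSTGRES_USER: mfp
      POSTGRES_PASSWORD: mfp
      POSTGRES_DB: mfp
    ports:
      - "5432:5432"
  grafana:
    image: grafana/grafana
    ports:
      - "3001:3000"
```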
- Web Interface: http://localhost:5000
- Grafana Dashboards: http://localhost:3001 (login with admin/admin)
- Export your nutrition data from MyFitnessPal as CSV
- Visit the web interface at http://localhost:5000
- Upload the exported CSV file
- After successful upload, view your data in Grafana
Select the date range 2014-09-01 to 2014-09-30 to see visualizations for September 2014. Switch users to see data for different users.
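Behind the upload flow, stored rows might be modeled with SQLAlchemy roughly as below. The table and column names are assumptions, and an in-memory SQLite database stands in for PostgreSQL so the sketch is self-contained:

```python
# Sketch of the storage model; names are assumptions, SQLite stands in
# for PostgreSQL here so the example runs without a running database.
import datetime

from sqlalchemy import Column, Date, Float, Integer, String, create_engine
from sqlalchemy.orm import Session, declarative_base

Base = declarative_base()

class NutritionEntry(Base):
    __tablename__ = "nutrition_entries"
    id = Column(Integer, primary_key=True)
    username = Column(String, nullable=False)
    date = Column(Date, nullable=False)
    calories = Column(Float)
    protein_g = Column(Float)
    carbs_g = Column(Float)
    fat_g = Column(Float)

engine = create_engine("sqlite:///:memory:")
Base.metadata.create_all(engine)

with Session(engine) as session:
    session.add(NutritionEntry(
        username="demo",
        date=datetime.date(2014, 9, 1),
        calories=2100, protein_g=110, carbs_g=220, fat_g=70,
    ))
    session.commit()
    count = session.query(NutritionEntry).count()
```

With PostgreSQL, the same model works unchanged by swapping the engine URL for the `DATABASE_URL` from docker-compose.yml.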
The application includes pre-configured dashboards for:
- Daily caloric intake
- Macronutrient breakdown (carbs, protein, fat)
- Nutritional trends over time
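A "Daily caloric intake" panel in Grafana's PostgreSQL data source might be driven by a query like this. The `nutrition_entries` table and its columns are assumptions about the schema; `$__timeFilter` is Grafana's standard time-range macro:

```sql
-- Hypothetical panel query; table and column names are assumptions.
SELECT
  date AS "time",
  SUM(calories) AS "calories"
FROM nutrition_entries
WHERE $__timeFilter(date)
GROUP BY date
ORDER BY date;
```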
This project includes a complete CI/CD pipeline using GitHub Actions:
- Automated Testing: All tests run on push and pull requests to the main branch
- Code Quality: Linting with flake8 ensures code standards
- Database Integration: Tests run against a PostgreSQL service container
- Docker Build: Automatically builds Docker images on successful code merge
- Docker Registry: Images are pushed to Docker Hub registry
- Deployment Ready: Pipeline prepared for extension to your preferred hosting platform
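The pipeline described above could be expressed in a GitHub Actions workflow along these lines. Job names, action versions, and image tags are illustrative, not the project's actual configuration:

```yaml
# Condensed workflow sketch; versions and tags are illustrative.
name: CI
on:
  push:
    branches: [main]
  pull_request:
    branches: [main]
jobs:
  test:
    runs-on: ubuntu-latest
    services:
      postgres:
        image: postgres:16
        env:
          POSTGRES_PASSWORD: postgres
        ports:
          - 5432:5432
    steps:
      - uses: actions/checkout@v4
      - uses: actions/setup-python@v5
        with:
          python-version: "3.12"
      - run: pip install -r requirements.txt pytest pytest-cov flake8
      - run: flake8 backend/ --count --select=E9,F63,F7,F82 --show-source --statistics
      - run: pytest backend/ --cov=backend
  docker:
    needs: test
    if: github.ref == 'refs/heads/main'
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      - uses: docker/login-action@v3
        with:
          username: ${{ secrets.DOCKERHUB_USERNAME }}
          password: ${{ secrets.DOCKERHUB_TOKEN }}
      - uses: docker/build-push-action@v6
        with:
          push: true
          tags: yourusername/myfitnessapp:latest
```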
To enable the Docker Hub integration, add these secrets to your GitHub repository:
- DOCKERHUB_USERNAME: Your Docker Hub username
- DOCKERHUB_TOKEN: Your Docker Hub access token
# Install test dependencies
pip install pytest pytest-cov flake8
# Run linting
flake8 backend/ --count --select=E9,F63,F7,F82 --show-source --statistics --exclude=venv,backend/venv,__pycache__,site-packages
# Run tests
pytest backend/ --cov=backend
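A test in the suite might look like the following. The `validate_row` helper is hypothetical and defined inline so the example is self-contained:

```python
# Minimal example of a pytest-style test; validate_row is a hypothetical
# helper, defined here only to make the example runnable.

def validate_row(row: dict) -> bool:
    """Accept a row only if it has a date and a non-negative calorie count."""
    try:
        return bool(row["Date"]) and float(row["Calories"]) >= 0
    except (KeyError, ValueError):
        return False

def test_validate_row_accepts_good_data():
    assert validate_row({"Date": "2014-09-01", "Calories": "2100"})

def test_validate_row_rejects_missing_calories():
    assert not validate_row({"Date": "2014-09-01"})
```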