Personal project implementing the Adam optimization algorithm from scratch, as described in the original paper (Kingma & Ba, 2015, "Adam: A Method for Stochastic Optimization").
This project replicates the Adam optimizer exactly as presented in the paper. It demonstrates gradient-based optimization on a linear regression problem, with an animated visualization that shows the fit improving as training progresses.
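For reference, the paper's update rule for a parameter $\theta$ with gradient $g_t$, step size $\alpha$, decay rates $\beta_1, \beta_2$, and small constant $\epsilon$ is:

$$
\begin{aligned}
m_t &= \beta_1 m_{t-1} + (1-\beta_1)\,g_t \\
v_t &= \beta_2 v_{t-1} + (1-\beta_2)\,g_t^2 \\
\hat{m}_t &= m_t/(1-\beta_1^t), \qquad \hat{v}_t = v_t/(1-\beta_2^t) \\
\theta_t &= \theta_{t-1} - \alpha\,\hat{m}_t/(\sqrt{\hat{v}_t} + \epsilon)
\end{aligned}
$$

The paper's suggested defaults are $\alpha = 0.001$, $\beta_1 = 0.9$, $\beta_2 = 0.999$, and $\epsilon = 10^{-8}$.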
- Full Adam optimizer implementation from the original paper
- Manual gradient computation for linear regression (see the sketch after this list)
- Real-time animated visualization of model convergence
- Educational code structure for learning optimization algorithms
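As a rough illustration of those two pieces, here is a minimal, self-contained sketch of an Adam step and hand-derived MSE gradients for a linear model `y = w * x + b`. All names are illustrative, not the project's actual code.

```python
import numpy as np

def mse_gradients(w, b, x, y):
    """Hand-derived mean-squared-error gradients dL/dw and dL/db."""
    error = (w * x + b) - y
    return 2.0 * np.mean(error * x), 2.0 * np.mean(error)

def adam_step(param, grad, m, v, t, lr=0.001, beta1=0.9, beta2=0.999, eps=1e-8):
    """One Adam update for a scalar parameter (the paper's Algorithm 1)."""
    m = beta1 * m + (1 - beta1) * grad        # biased first-moment estimate
    v = beta2 * v + (1 - beta2) * grad ** 2   # biased second-moment estimate
    m_hat = m / (1 - beta1 ** t)              # bias-corrected first moment
    v_hat = v / (1 - beta2 ** t)              # bias-corrected second moment
    return param - lr * m_hat / (np.sqrt(v_hat) + eps), m, v

# Tiny demo on synthetic data (lr raised above the paper's default so this
# toy problem converges within a few hundred steps).
rng = np.random.default_rng(0)
x = rng.uniform(-1.0, 1.0, 100)
y = 3.0 * x + 0.5 + rng.normal(0.0, 0.1, 100)

w = b = 0.0
mw = vw = mb = vb = 0.0
for t in range(1, 501):
    dw, db = mse_gradients(w, b, x, y)
    w, mw, vw = adam_step(w, dw, mw, vw, t, lr=0.1)
    b, mb, vb = adam_step(b, db, mb, vb, t, lr=0.1)
```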
To install dependencies and run the demo:

```bash
poetry install
poetry run python src/adam_optimizer.py
```
The script generates:
- Console output showing training progress
- Animated visualization of the model learning process
- GIF saved to `.github/adam_optimization.gif` (see the sketch below for one way such a GIF can be produced)
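The plotting code isn't shown here, but assuming matplotlib is used (an assumption; the project may do this differently, and all names below are hypothetical), an animated fit can be saved as a GIF roughly like this:

```python
# Hypothetical sketch: assumes a `history` list of (w, b) pairs was
# recorded during training, plus the `x` and `y` arrays from above.
import numpy as np
import matplotlib.pyplot as plt
from matplotlib import animation

def save_training_gif(x, y, history, path=".github/adam_optimization.gif"):
    fig, ax = plt.subplots()
    ax.scatter(x, y, s=10)                    # the training data
    (line,) = ax.plot([], [], color="red")    # the evolving fit line
    xs = np.linspace(x.min(), x.max(), 2)

    def update(frame):
        w, b = history[frame]                 # parameters at this iteration
        line.set_data(xs, w * xs + b)         # redraw the current fit
        ax.set_title(f"iteration {frame}")
        return (line,)

    anim = animation.FuncAnimation(fig, update, frames=len(history))
    anim.save(path, writer=animation.PillowWriter(fps=20))
```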
- Python 3.13+
- Poetry