diff --git a/README.md b/README.md
index 9113ce7..bd0153a 100644
--- a/README.md
+++ b/README.md
@@ -4,6 +4,21 @@
 Emerging Optimizers is a research project focused on understanding and optimizing the algorithmic behavior of Shampoo-class optimizers (Shampoo, SOAP, Muon, etc.) and their implications for performance and GPU systems in large language model training.
 
+## Installation
+
+### Prerequisites
+
+- Python 3.12 or higher
+- PyTorch 2.0 or higher
+
+### Install from Source
+
+```bash
+git clone https://github.com/NVIDIA-NeMo/Emerging-Optimizers.git
+cd Emerging-Optimizers
+pip install .
+```
+
 ## User guide
 
 Coming soon.
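The version requirements in the Prerequisites section can be checked up front before running `pip install .`; a minimal sketch (the `meets_prereqs` helper and the example PyTorch version tuple are illustrative, not part of the repository):

```python
import sys

# Minimum versions from the Prerequisites section of the README.
MIN_PYTHON = (3, 12)
MIN_TORCH = (2, 0)


def meets_prereqs(python_version, torch_version):
    """Return True if both (major, minor) version tuples satisfy the minimums."""
    return python_version >= MIN_PYTHON and torch_version >= MIN_TORCH


# Check the running interpreter; the PyTorch tuple here is a placeholder
# (in practice it would come from torch.__version__).
print(meets_prereqs(sys.version_info[:2], (2, 1)))
```

Tuple comparison handles the version ordering here; a real check would parse `torch.__version__` rather than hard-code a tuple.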