pytorch-optimizer v3.0.0
Change Log
The major version has been updated! (v2.12.0 -> v3.0.0) (#164)
Many optimizers, learning rate schedulers, and loss functions are available in pytorch-optimizer.
Currently, pytorch-optimizer supports 67 optimizers (+ bitsandbytes), 11 lr schedulers, and 13 loss functions, and has reached about 4K ~ 50K downloads / month (with a peak of 75K downloads / month)!
The reason for updating the major version from v2 to v3 is that it feels like a good time to ship the recent implementations (the last release was about 7 months ago), and I plan to pivot toward new concepts like training utilities while maintaining the original features (e.g. the optimizers).
Richer test cases, benchmarks, and examples are also on the list!
Finally, thanks for using pytorch-optimizer, and feel free to make any requests :)
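As a quick orientation, here is a minimal sketch of pulling one of the supported optimizers by name with the `load_optimizer` helper from the README; the optimizer name (`adamp`) and the `lr` value are illustrative choices, not recommendations.

```python
import torch
from torch import nn

from pytorch_optimizer import load_optimizer

model = nn.Linear(10, 1)  # a toy model, just to have parameters to optimize

# load_optimizer() resolves an optimizer class from its registry name;
# 'adamp' is an illustrative pick among the supported optimizers
optimizer_class = load_optimizer(optimizer='adamp')
optimizer = optimizer_class(model.parameters(), lr=1e-3)

optimizer.zero_grad()
loss = model(torch.randn(4, 10)).pow(2).mean()
loss.backward()
optimizer.step()
```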
Feature
- Implement `REX` lr scheduler. (#217, #222)
- Implement `Aida` optimizer. (#220, #221) (see the sketch below this list)
- Implement `WSAM` optimizer. (#213, #216)
- Implement `GaLore` optimizer. (#224, #228)
- Implement `Adalite` optimizer. (#225, #229)
- Implement `bSAM` optimizer. (#212, #233)
- Implement `Schedule-Free` optimizer. (#230, #233)
- Implement `EMCMC`. (#231, #233)
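Most of the new arrivals follow the standard `torch.optim` constructor shape (parameters first, then hyperparameters). Here is a minimal sketch with `Aida`, assuming it is exported at the package top level like the existing optimizers; the `lr` value is illustrative:

```python
import torch
from torch import nn

from pytorch_optimizer import Aida  # new in v3.0.0

model = nn.Linear(10, 1)

# Aida is an Adam-family optimizer, so a typical Adam-like lr is used here;
# tune it for your own workload
optimizer = Aida(model.parameters(), lr=1e-3)

for _ in range(3):
    optimizer.zero_grad()
    loss = model(torch.randn(8, 10)).pow(2).mean()
    loss.backward()
    optimizer.step()
```

Note that a few of the additions have richer protocols than a plain `step()` loop; for example, the Schedule-Free method switches the optimizer between train and eval parameter states in its reference implementation, so the per-optimizer docs are worth a look.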
Fix
- Fix `SRMM` to allow operation beyond `memory_length`. (#227)
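A hedged sketch of what this fix enables, assuming `memory_length` is a constructor argument of `SRMM` as the note suggests; the tiny value of 5 and the `lr` are chosen only so the loop below steps past the window:

```python
import torch
from torch import nn

from pytorch_optimizer import SRMM

model = nn.Linear(10, 1)

# memory_length bounds SRMM's averaging window; before the fix, optimizing
# for more steps than the window covered could fail
optimizer = SRMM(model.parameters(), lr=1e-2, memory_length=5)

for _ in range(20):  # 20 steps > memory_length, exercising the fixed path
    optimizer.zero_grad()
    loss = model(torch.randn(8, 10)).pow(2).mean()
    loss.backward()
    optimizer.step()
```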
Dependency
- Drop `Python 3.7` support officially. (#221)
  - Please check the README.
- Update `bitsandbytes` to `0.43.0`. (#228)
Docs
- Add missing parameters in `Ranger21` optimizer document. (#214, #215)
- Fix `WSAM` optimizer paper link. (#219)
Contributions
Diff
- from the previous major version: `2.0.0...3.0.0`
- from the previous version: `2.12.0...3.0.0`