pytorch-optimizer v3.4.1
Change Log
Feature
- Support `GCSAM` optimizer. (#343, #344)
  - Gradient Centralized Sharpness Aware Minimization
  - you can use it from the `SAM` optimizer by setting `use_gc=True`.
- Support `LookSAM` optimizer. (#343, #344)
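The gradient-centralization step that gives `GCSAM` its name can be sketched in plain Python. This is an illustrative sketch of the GC operation (subtracting the per-output-channel mean from a weight gradient), not the library's actual implementation:

```python
def centralize_gradient(grad):
    """Gradient centralization for a 2-D weight gradient, given as a
    list of rows: subtract each row's mean so every row of the
    returned gradient sums to zero."""
    return [[g - sum(row) / len(row) for g in row] for row in grad]

grad = [[1.0, 2.0, 3.0],
        [4.0, 4.0, 4.0]]
centered = centralize_gradient(grad)
# first row becomes [-1.0, 0.0, 1.0]; second becomes [0.0, 0.0, 0.0]
```

In `GCSAM`, this centralization is applied to the gradients before the SAM perturbation step.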
Update
- Support alternative precision training for the `Shampoo` optimizer. (#339)
- Add more features to and tune the `Ranger25` optimizer. (#340)
  - `AGC` + `Lookahead` variants
  - change default beta1, beta2 to 0.95 and 0.98, respectively
- Skip adding the `Lookahead` wrapper in case of `Ranger*` optimizers, which already have it, in `create_optimizer()`. (#340)
- Improved optimizer visualization. (#345)
- Rename `pytorch_optimizer.optimizer.gc` to `pytorch_optimizer.optimizer.gradient_centralization` to avoid a possible conflict with the Python built-in module `gc`. (#349)
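The Lookahead-skip behavior above can be illustrated with a minimal factory sketch. The names `build_optimizer` and `ALREADY_WRAPPED` are hypothetical, not the library's API; the point is only the guard that avoids double-wrapping optimizers that already embed Lookahead:

```python
# Optimizers that already contain Lookahead internally (illustrative set,
# standing in for the Ranger* family).
ALREADY_WRAPPED = {"ranger", "ranger21", "ranger25"}

def build_optimizer(name, use_lookahead=False):
    """Hypothetical factory: wrap the base optimizer in Lookahead only
    when it does not already include it."""
    base = name.lower()  # stand-in for constructing the real optimizer
    if use_lookahead and base not in ALREADY_WRAPPED:
        return f"Lookahead({base})"
    return base

print(build_optimizer("Ranger25", use_lookahead=True))  # wrapper skipped
print(build_optimizer("AdamW", use_lookahead=True))     # wrapper applied
```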
Bug
Docs
- Update the visualizations. (#340)
Contributions
thanks to @AidinHamedi