pytorch-optimizer v2.1.1
Change Log
Feature
- Support gradient centralization for Adai optimizer (see the usage sketch after this list)
- Support AdamD debias for AdaPNM optimizer
- Register custom exceptions (e.g. NoSparseGradientError, NoClosureError, ...)
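A minimal sketch of how these additions might be exercised. The keyword names `use_gc` and `adamd_debias_term` and the top-level exception import are assumptions based on conventions used elsewhere in the library; they are not confirmed by this changelog.

```python
import torch
from torch import nn

from pytorch_optimizer import Adai, AdaPNM
from pytorch_optimizer import NoSparseGradientError  # assumed import path

model = nn.Linear(10, 2)

# gradient centralization for Adai (keyword name assumed)
optimizer = Adai(model.parameters(), lr=1e-3, use_gc=True)

# AdamD-style debias for AdaPNM (keyword name assumed)
optimizer = AdaPNM(model.parameters(), lr=1e-3, adamd_debias_term=True)

# the newly registered exception classes can be caught explicitly,
# e.g. when an optimizer that rejects sparse gradients receives one
try:
    loss = model(torch.randn(4, 10)).sum()
    loss.backward()
    optimizer.step()
except NoSparseGradientError:
    pass  # fall back to a sparse-capable optimizer here
```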
Documentation
- Add API documentation
Bug
- Fix SAM optimizer (usage sketch below)
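For reference, SAM performs a two-step update (perturb, then descend). A hedged sketch of the usual call pattern; the exact constructor signature is assumed from the common SAM implementation this optimizer follows, not stated in this changelog.

```python
import torch
from torch import nn

from pytorch_optimizer import SAM

model = nn.Linear(10, 2)
criterion = nn.MSELoss()

# SAM wraps a base optimizer; constructor signature assumed
optimizer = SAM(model.parameters(), torch.optim.SGD, lr=0.1, momentum=0.9)

x, y = torch.randn(8, 10), torch.randn(8, 2)

# first forward-backward pass: climb to the local worst-case weights
criterion(model(x), y).backward()
optimizer.first_step(zero_grad=True)

# second forward-backward pass: update from the perturbed weights
criterion(model(x), y).backward()
optimizer.second_step(zero_grad=True)
```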