pytorch-optimizer v2.10.0
Change Log
Feature
- Implement Amos optimizer (#174)
- Implement SignSGD optimizer (#176)
- Implement AdaHessian optimizer (#176)
- Implement SophiaH optimizer (#173, #176)
- Implement re-usable functions to compute the Hessian in BaseOptimizer (#176, #177)
  - two types of distribution are supported (Gaussian, Rademacher)
- Support AdamD feature for AdaHessian optimizer (#177)
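Hessian-based optimizers like AdaHessian and SophiaH typically avoid forming the full Hessian and instead estimate its diagonal with a Hutchinson-style estimator, diag(H) ≈ E[z ⊙ Hz], where z is drawn from a zero-mean distribution such as the Gaussian or Rademacher ones noted above. A minimal NumPy sketch of that idea (the function name and signature here are illustrative, not the library's API):

```python
import numpy as np

def hutchinson_diag(hvp, dim, n_samples=2000, distribution="rademacher", seed=0):
    """Estimate the Hessian diagonal from Hessian-vector products.

    hvp: callable returning H @ z for a vector z (autograd would supply
    this in practice; here it is passed in directly).
    """
    rng = np.random.default_rng(seed)
    est = np.zeros(dim)
    for _ in range(n_samples):
        if distribution == "rademacher":
            z = rng.choice([-1.0, 1.0], size=dim)  # entries in {-1, +1}
        else:  # "gaussian"
            z = rng.standard_normal(dim)
        est += z * hvp(z)  # z ⊙ Hz has expectation diag(H)
    return est / n_samples

# Toy quadratic f(x) = 0.5 * x^T H x, whose Hessian-vector product is H @ z.
H = np.array([[3.0, 1.0], [1.0, 2.0]])
diag_est = hutchinson_diag(lambda z: H @ z, dim=2)
print(diag_est)  # close to [3.0, 2.0]
```

Rademacher probes give a lower-variance estimate of the diagonal than Gaussian ones because z ⊙ Hz is exactly H_ii plus zero-mean cross terms, which is one reason both distributions are exposed as options.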
Diff
Contributions
Thanks to @i404788