Releases · kozistr/pytorch_optimizer
pytorch-optimizer v0.0.9
Implement DiffRGrad optimizer
pytorch-optimizer v0.0.8
Implement DiffGrad optimizer
pytorch-optimizer v0.0.7
Improve MADGRAD optimizer
pytorch-optimizer v0.0.6
- Implement Sharpness-Aware Minimization (SAM) optimizer
- Support Adaptive SAM
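A rough usage sketch for the SAM wrapper introduced in this release, assuming it follows the common two-pass `first_step` / `second_step` API (as in the widely used SAM reference implementation) with an `adaptive` flag for ASAM; the import path and argument names are assumptions and may differ in this version.

```python
import torch
from pytorch_optimizer import SAM  # import path assumed

model = torch.nn.Linear(10, 2)
criterion = torch.nn.CrossEntropyLoss()
# SAM wraps a base optimizer; adaptive=True is assumed to enable Adaptive SAM.
optimizer = SAM(model.parameters(), torch.optim.SGD, lr=0.1, rho=0.05, adaptive=True)

x, y = torch.randn(8, 10), torch.randint(0, 2, (8,))

# First forward/backward pass: perturb weights toward the local worst case.
criterion(model(x), y).backward()
optimizer.first_step(zero_grad=True)

# Second forward/backward pass: take the actual update from the perturbed point.
criterion(model(x), y).backward()
optimizer.second_step(zero_grad=True)
```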
pytorch-optimizer v0.0.5
- Combine AdaBoundW into AdaBound optimizer with weight_decouple parameter (usage sketch after this list)
- Implement AdaBelief optimizer
- Support fp16
- Support weight_decouple with AdamW scheme
- Support rectified update similar to RAdam
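A minimal sketch of the decoupled weight decay option mentioned above, assuming AdaBound exposes a standard `torch.optim`-style constructor; apart from `weight_decouple`, the parameter names (e.g. `final_lr`) are assumptions based on the original AdaBound paper and may differ in this release.

```python
import torch
from pytorch_optimizer import AdaBound  # import path assumed

model = torch.nn.Linear(10, 2)
optimizer = AdaBound(
    model.parameters(),
    lr=1e-3,
    final_lr=0.1,          # bound the adaptive step size toward an SGD-like rate (assumed name)
    weight_decay=1e-2,
    weight_decouple=True,  # apply weight decay in the decoupled, AdamW-style scheme
)

x, y = torch.randn(8, 10), torch.randn(8, 2)
loss = torch.nn.functional.mse_loss(model(x), y)
loss.backward()
optimizer.step()
optimizer.zero_grad()
```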
pytorch-optimizer v0.0.4
Implement AdaBound/AdaBoundW optimizers
pytorch-optimizer v0.0.3
Implement AdaHessian optimizer
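A rough sketch of the training-loop change AdaHessian typically requires, assuming the optimizer estimates the Hessian diagonal via Hutchinson's method inside `step()`, which needs a backward pass built with `create_graph=True`; the import path and constructor arguments are assumptions for this version.

```python
import torch
from pytorch_optimizer import AdaHessian  # import path assumed

model = torch.nn.Linear(10, 2)
optimizer = AdaHessian(model.parameters(), lr=1e-1)

x, y = torch.randn(8, 10), torch.randn(8, 2)
loss = torch.nn.functional.mse_loss(model(x), y)

# create_graph=True keeps the autograd graph so Hessian-vector products can be taken in step().
loss.backward(create_graph=True)
optimizer.step()
optimizer.zero_grad()
```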
Release v0.0.2
Implement MADGRAD optimizer
Release v0.0.1
Initial release