Releases: kozistr/pytorch_optimizer

pytorch-optimizer v0.0.9

23 Sep 10:32
bc22b00

Implement DiffRGrad optimizer
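
A minimal drop-in usage sketch for the new optimizer; the import path and constructor signature are assumed from the package name and the usual torch.optim-style API, and the hyper-parameters are illustrative:

```python
import torch
from torch import nn
from pytorch_optimizer import DiffRGrad  # import path assumed

model = nn.Linear(10, 1)
optimizer = DiffRGrad(model.parameters(), lr=1e-3)  # lr is illustrative

x, y = torch.randn(32, 10), torch.randn(32, 1)
loss = nn.functional.mse_loss(model(x), y)

optimizer.zero_grad()
loss.backward()
optimizer.step()
```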

pytorch-optimizer v0.0.8

23 Sep 06:40
5113c54

Implement DiffGrad optimizer

pytorch-optimizer v0.0.7

22 Sep 13:52

Improve MADGRAD optimizer

pytorch-optimizer v0.0.6

22 Sep 08:28
8ef27c3

  • Implement Sharpness-Aware Minimization (SAM) optimizer
    • Support Adaptive SAM
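
SAM performs two forward/backward passes per update: one to find a worst-case weight perturbation within a small neighborhood, and one to update the weights at that perturbed point. The sketch below is a minimal example assuming the two-step `first_step`/`second_step` API of the widely used SAM reference implementation, that SAM wraps a base optimizer, and that `adaptive=True` enables the Adaptive SAM variant; hyper-parameters are illustrative:

```python
import torch
from torch import nn
from pytorch_optimizer import SAM  # import path assumed

model = nn.Linear(10, 1)
criterion = nn.MSELoss()
# SAM wraps a base optimizer; rho/adaptive/lr/momentum values are illustrative.
optimizer = SAM(model.parameters(), torch.optim.SGD, lr=0.1, momentum=0.9, rho=0.05, adaptive=True)

x, y = torch.randn(32, 10), torch.randn(32, 1)

# First pass: compute gradients and move to the locally "sharpest" point.
criterion(model(x), y).backward()
optimizer.first_step(zero_grad=True)

# Second pass: compute gradients at the perturbed weights and apply the update.
criterion(model(x), y).backward()
optimizer.second_step(zero_grad=True)
```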

pytorch-optimizer v0.0.5

22 Sep 07:42
3c7a89f

  • Merge AdaBoundW into the AdaBound optimizer via a weight_decouple parameter
  • Implement AdaBelief optimizer
    • Support fp16
    • Support decoupled weight decay (weight_decouple), following the AdamW scheme
    • Support a rectified update, similar to RAdam
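
A usage sketch for AdaBelief with the options listed above; the weight_decouple and rectify flag names are assumed from the release notes, and the other hyper-parameters are illustrative:

```python
import torch
from torch import nn
from pytorch_optimizer import AdaBelief  # import path assumed

model = nn.Linear(10, 1)
# weight_decouple applies AdamW-style decoupled weight decay;
# rectify enables the RAdam-style rectified update (flag names assumed).
optimizer = AdaBelief(
    model.parameters(),
    lr=1e-3,
    weight_decay=1e-2,
    weight_decouple=True,
    rectify=True,
)

x, y = torch.randn(32, 10), torch.randn(32, 1)
loss = nn.functional.mse_loss(model(x), y)

optimizer.zero_grad()
loss.backward()
optimizer.step()
```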

pytorch-optimizer v0.0.4

22 Sep 06:59
278c29e

Implement AdaBound/AdaBoundW optimizers
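
A usage sketch for AdaBound, which anneals from an Adam-like update toward SGD by clipping the per-parameter step size to bounds that converge to a final learning rate. The final_lr parameter name follows the original AdaBound implementation and is an assumption here, as is the import path:

```python
import torch
from torch import nn
from pytorch_optimizer import AdaBound  # import path assumed

model = nn.Linear(10, 1)
# final_lr is the SGD-like learning rate the bounds converge to (name assumed).
optimizer = AdaBound(model.parameters(), lr=1e-3, final_lr=0.1)

x, y = torch.randn(32, 10), torch.randn(32, 1)
loss = nn.functional.mse_loss(model(x), y)

optimizer.zero_grad()
loss.backward()
optimizer.step()
```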

pytorch-optimizer v0.0.3

22 Sep 05:58
29a9dd3

Implement AdaHessian optimizer
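
AdaHessian is a second-order method that scales updates by an estimate of the Hessian diagonal, which requires keeping the autograd graph alive during the backward pass. A minimal sketch, assuming the usage pattern of the original AdaHessian implementation and the import path below:

```python
import torch
from torch import nn
from pytorch_optimizer import AdaHessian  # import path assumed

model = nn.Linear(10, 1)
optimizer = AdaHessian(model.parameters(), lr=1e-1)  # lr is illustrative

x, y = torch.randn(32, 10), torch.randn(32, 1)
loss = nn.functional.mse_loss(model(x), y)

optimizer.zero_grad()
# The Hessian diagonal is estimated from the gradient graph, so the backward
# pass must retain it (usage pattern assumed from the original implementation).
loss.backward(create_graph=True)
optimizer.step()
```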

Release v0.0.2

21 Sep 16:09

Add MADGRAD optimizer

Release v0.0.1

21 Sep 13:52

Initial release