Releases: kozistr/pytorch_optimizer
pytorch-optimizer v3.0.2
pytorch-optimizer v3.0.1
Change Log
Feature
- Implement `FAdam` optimizer. (#241, #242) (usage sketch below)
- Tweak `AdaFactor` optimizer. (#236, #243)
  - support not using the first momentum when beta1 is not given
  - default dtype for the first momentum to `bfloat16`
  - clip the second momentum to 0.999
- Implement `GrokFast` optimizer. (#244, #245)
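A minimal usage sketch for one of the new optimizers, assuming `FAdam` is exported from `pytorch_optimizer` under that name, as the library does for its other optimizers; the model and hyperparameters are placeholders:

```python
import torch
from torch import nn

# assumption: FAdam is importable by name from pytorch_optimizer
from pytorch_optimizer import FAdam

model = nn.Linear(16, 2)
optimizer = FAdam(model.parameters(), lr=1e-3)

x, y = torch.randn(8, 16), torch.randint(0, 2, (8,))
loss = nn.functional.cross_entropy(model(x), y)

optimizer.zero_grad()
loss.backward()
optimizer.step()
```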
Bug
- Wrong typing of `reg_noise`. (#239, #240)
- `Lookahead`'s `param_groups` attribute is not loaded from checkpoint. (#237, #238) (see the sketch below)
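For context on the `Lookahead` checkpoint bug, a sketch of the standard save/restore round trip whose `param_groups` were affected; `AdamP` as the wrapped optimizer and the `k`/`alpha` arguments are illustrative assumptions:

```python
import torch
from torch import nn

# assumption: Lookahead and AdamP are importable by name and Lookahead accepts k/alpha
from pytorch_optimizer import AdamP, Lookahead

model = nn.Linear(16, 2)
optimizer = Lookahead(AdamP(model.parameters(), lr=1e-3), k=5, alpha=0.5)

# save a checkpoint that includes the optimizer state (and its param_groups)
torch.save({'optimizer': optimizer.state_dict()}, 'checkpoint.pt')

# ... later, restore it
checkpoint = torch.load('checkpoint.pt')
optimizer.load_state_dict(checkpoint['optimizer'])
```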
Contributions
thanks to @michaldyczko
pytorch-optimizer v3.0.0
Change Log
The major version is updated! (v2.12.0 -> v3.0.0) (#164)
Many optimizers, learning rate schedulers, and objective functions are in pytorch-optimizer.
Currently, pytorch-optimizer supports 67 optimizers (+ bitsandbytes), 11 lr schedulers, and 13 loss functions, and reached about 4 ~ 50K downloads / month (peak is 75K downloads / month)!
The reason for updating the major version from v2 to v3 is that I think it's a good time to ship the recent implementations (the last update was about 7 months ago) and plan to pivot to new concepts like training utilities while maintaining the original features (e.g. optimizers).
Also, rich test cases, benchmarks, and examples are on the list!
Finally, thanks for using the pytorch-optimizer, and feel free to make any requests :)
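The supported-component counts above can be checked programmatically; a small sketch, assuming the `get_supported_*` helpers are exported as in recent versions of the library:

```python
# assumption: these helper functions are exported by pytorch_optimizer
from pytorch_optimizer import (
    get_supported_loss_functions,
    get_supported_lr_schedulers,
    get_supported_optimizers,
)

print(len(get_supported_optimizers()))      # number of available optimizers
print(len(get_supported_lr_schedulers()))   # number of available lr schedulers
print(len(get_supported_loss_functions()))  # number of available loss functions
```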
Feature
- Implement `REX` lr scheduler. (#217, #222)
- Implement `Aida` optimizer. (#220, #221)
- Implement `WSAM` optimizer. (#213, #216)
- Implement `GaLore` optimizer. (#224, #228)
- Implement `Adalite` optimizer. (#225, #229)
- Implement `bSAM` optimizer. (#212, #233)
- Implement `Schedule-Free` optimizer. (#230, #233) (usage sketch below)
- Implement `EMCMC`. (#231, #233)
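The Schedule-Free optimizer is the one entry above with a different calling convention: it has to be switched between train and eval modes around updates. A minimal sketch, assuming the implementation is exposed as `ScheduleFreeAdamW` and follows the reference Schedule-Free API with `train()`/`eval()` methods:

```python
import torch
from torch import nn

# assumption: the Schedule-Free implementation is exported as ScheduleFreeAdamW
from pytorch_optimizer import ScheduleFreeAdamW

model = nn.Linear(16, 2)
optimizer = ScheduleFreeAdamW(model.parameters(), lr=1e-3)

optimizer.train()  # switch the optimizer to training mode before updates
x, y = torch.randn(8, 16), torch.randint(0, 2, (8,))
loss = nn.functional.cross_entropy(model(x), y)
optimizer.zero_grad()
loss.backward()
optimizer.step()

optimizer.eval()   # switch to eval mode before validation or checkpointing
```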
Fix
- Fix SRMM to allow operation beyond memory_length. (#227)
Dependency
- Drop `Python 3.7` support officially. (#221)
  - Please check the README.
- Update `bitsandbytes` to `0.43.0`. (#228)
Docs
- Add missing parameters in `Ranger21` optimizer document. (#214, #215)
- Fix `WSAM` optimizer paper link. (#219)
Contributions
Diff
- from the previous major version: 2.0.0...3.0.0
- from the previous version: 2.12.0...3.0.0
pytorch-optimizer v2.12.0
Change Log
Feature
- Support `bitsandbytes` optimizer. (#211) (usage sketch below)
  - now, you can install with `pip3 install pytorch-optimizer[bitsandbytes]`
  - supports 8 bnb optimizers: `bnb_adagrad8bit`, `bnb_adam8bit`, `bnb_adamw8bit`, `bnb_lion8bit`, `bnb_lamb8bit`, `bnb_lars8bit`, `bnb_rmsprop8bit`, `bnb_sgd8bit`.
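A sketch of picking up one of the bnb optimizers by its string name, assuming `load_optimizer` resolves these identifiers and that `bitsandbytes` has been installed via the extra shown above:

```python
from torch import nn

# assumption: load_optimizer resolves the bnb_* names to their optimizer classes
from pytorch_optimizer import load_optimizer

model = nn.Linear(16, 2)

optimizer_class = load_optimizer('bnb_adamw8bit')
optimizer = optimizer_class(model.parameters(), lr=1e-3)
```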
Docs
- Introduce `mkdocs` with `material` theme. (#204, #206)
  - documentation: https://pytorch-optimizers.readthedocs.io/en/latest/
Diff
pytorch-optimizer v2.11.2
Change Log
Feature
- Implement DAdaptLion optimizer (#203)
Fix
- Fix Lookahead optimizer (#200, #201, #202) (see the sketch below)
  - When using PyTorch Lightning, which expects your optimiser to be a subclass of `Optimizer`.
- Fix default `rectify` to `False` in `AdaBelief` optimizer (#203)
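A brief sketch tying the two fixes together: after the fix, `Lookahead` should pass the `torch.optim.Optimizer` subclass check that PyTorch Lightning performs, and `AdaBelief`'s rectification now has to be enabled explicitly. The wrapped `AdamP` optimizer and the `k`/`alpha` arguments are illustrative assumptions:

```python
import torch
from torch import nn

# assumption: AdaBelief, AdamP, and Lookahead are importable by name
from pytorch_optimizer import AdaBelief, AdamP, Lookahead

model = nn.Linear(16, 2)

# Lookahead should now satisfy isinstance checks against torch.optim.Optimizer
optimizer = Lookahead(AdamP(model.parameters(), lr=1e-3), k=5, alpha=0.5)
assert isinstance(optimizer, torch.optim.Optimizer)

# rectify now defaults to False; opt back in explicitly if desired
adabelief = AdaBelief(model.parameters(), lr=1e-3, rectify=True)
```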
Test
- Add `DynamicLossScaler` test case
Docs
- Highlight the code blocks
- Fix pepy badges
Contributions
thanks to @georg-wolflein
Diff
pytorch-optimizer v2.11.1
Change Log
Feature
- Implement Tiger optimizer (#192)
- Implement CAME optimizer (#196)
- Implement loss functions (#198) (usage sketch after this list)
- Tversky Loss : Tversky loss function for image segmentation using 3D fully convolutional deep networks
- Focal Tversky Loss
- Lovasz Hinge Loss : The Lovász-Softmax loss: A tractable surrogate for the optimization of the intersection-over-union measure in neural networks
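A heavily hedged sketch of using one of the new segmentation losses, assuming the Tversky loss is exported as `TverskyLoss`; constructor arguments and expected tensor shapes may differ in the actual implementation:

```python
import torch

# assumption: TverskyLoss is exported by name; arguments and shapes may differ
from pytorch_optimizer import TverskyLoss

criterion = TverskyLoss()

logits = torch.randn(4, 1, 32, 32, requires_grad=True)   # raw predictions for a binary mask
targets = torch.randint(0, 2, (4, 1, 32, 32)).float()    # ground-truth mask

loss = criterion(logits, targets)
loss.backward()
```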
Diff
pytorch-optimizer v2.11.0
Change Log
Feature
- Implement PAdam optimizer (#186)
- Implement LOMO optimizer (#188)
- Implement loss functions (#189) (usage sketch after this list)
- BCELoss
- BCEFocalLoss
- FocalLoss : Focal Loss for Dense Object Detection
- FocalCosineLoss : Data-Efficient Deep Learning Method for Image Classification Using Data Augmentation, Focal Cosine Loss, and Ensemble
- DiceLoss : Generalised Dice overlap as a deep learning loss function for highly unbalanced segmentations
- LDAMLoss : Learning Imbalanced Datasets with Label-Distribution-Aware Margin Loss
- JaccardLoss
- BiTemperedLogisticLoss : Robust Bi-Tempered Logistic Loss Based on Bregman Divergences
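A similarly hedged sketch for the classification-oriented losses, assuming the focal loss is exported as `FocalLoss`; constructor arguments (e.g. `alpha`, `gamma`) and expected input shapes may differ in the actual implementation:

```python
import torch

# assumption: FocalLoss is exported by name; arguments and shapes may differ
from pytorch_optimizer import FocalLoss

criterion = FocalLoss()

logits = torch.randn(8, 1, requires_grad=True)     # raw scores for a binary task
targets = torch.randint(0, 2, (8, 1)).float()      # binary labels

loss = criterion(logits, targets)
loss.backward()
```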
Diff
pytorch-optimizer v2.10.1
pytorch-optimizer v2.10.0
Change Log
Feature
- Implement Amos optimizer (#174)
- Implement SignSGD optimizer (#176)
- Implement AdaHessian optimizer (#176)
- Implement SophiaH optimizer (#173, #176)
- Implement re-usable functions to compute hessian in `BaseOptimizer` (#176, #177) (see the sketch below)
  - two types of distribution are supported (`Gaussian`, `Rademacher`).
- Support `AdamD` feature for AdaHessian optimizer (#177)
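A minimal sketch of how the hessian-based optimizers above (e.g. `AdaHessian`) are usually driven: the backward pass keeps the graph so the Hutchinson estimate of the diagonal hessian can be computed during `step()`. The exact calling convention is an assumption based on common AdaHessian implementations:

```python
import torch
from torch import nn

# assumption: AdaHessian is importable by name from pytorch_optimizer
from pytorch_optimizer import AdaHessian

model = nn.Linear(16, 2)
optimizer = AdaHessian(model.parameters(), lr=1e-1)

x, y = torch.randn(8, 16), torch.randint(0, 2, (8,))
loss = nn.functional.cross_entropy(model(x), y)

optimizer.zero_grad()
# keep the graph so hessian-vector products can be evaluated inside step()
loss.backward(create_graph=True)
optimizer.step()
```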
Diff
Contributions
thanks to @i404788