pytorch-optimizer v2.0.0
Change Log
- Refactor the package depth
  - 4 depths
    - pytorch_optimizer.lr_scheduler: lr schedulers
    - pytorch_optimizer.optimizer: optimizers
    - pytorch_optimizer.base: base utils
    - pytorch_optimizer.experimental: any experimental features
  - e.g. pytorch_optimizer.adamp -> pytorch_optimizer.optimizer.adamp
  - Still `from pytorch_optimizer import AdamP` is possible
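The backward-compatible flat import works because the top-level package re-exports classes from the deeper modules. A minimal sketch of that re-export pattern, using an in-memory stand-in package `pkg` (the names here are illustrative, not the library's actual module code):

```python
import sys
import types

class AdamP:  # stand-in for the real optimizer class
    pass

# The class "lives" in the deep module pkg.optimizer.adamp ...
adamp_mod = types.ModuleType("pkg.optimizer.adamp")
adamp_mod.AdamP = AdamP

optimizer_pkg = types.ModuleType("pkg.optimizer")
optimizer_pkg.adamp = adamp_mod

# ... and the top-level package re-exports it, as an __init__.py would,
# so the old one-level import path keeps working.
pkg = types.ModuleType("pkg")
pkg.optimizer = optimizer_pkg
pkg.AdamP = adamp_mod.AdamP

for name, mod in [("pkg", pkg), ("pkg.optimizer", optimizer_pkg),
                  ("pkg.optimizer.adamp", adamp_mod)]:
    sys.modules[name] = mod

# Both import styles now resolve to the same class:
from pkg import AdamP as FlatAdamP
from pkg.optimizer.adamp import AdamP as DeepAdamP
assert FlatAdamP is DeepAdamP
```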
- Implement lr schedulers
  - CosineAnealingWarmupRestarts
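The shape of a cosine-annealing-with-warmup-restarts schedule can be sketched in plain Python. This is an illustration of the general technique only, not the library's implementation; the function name, cycle parameters, and defaults are assumptions:

```python
import math

def cosine_annealing_warmup_restarts(step, cycle_len, warmup_steps,
                                     max_lr=1e-3, min_lr=1e-6):
    """Illustrative schedule: each cycle begins with a linear warmup from
    min_lr to max_lr, then cosine-decays back to min_lr before restarting."""
    t = step % cycle_len  # position within the current restart cycle
    if t < warmup_steps:
        return min_lr + (max_lr - min_lr) * t / warmup_steps
    progress = (t - warmup_steps) / (cycle_len - warmup_steps)
    return min_lr + 0.5 * (max_lr - min_lr) * (1 + math.cos(math.pi * progress))
```

For example, with `cycle_len=100` and `warmup_steps=10`, the lr rises from `min_lr` at step 0 to `max_lr` at step 10, decays toward `min_lr` by step 99, then restarts at step 100.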
- Implement (experimental) lr schedulers
  - DeBERTaV3-large layer-wise lr scheduler
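A layer-wise lr scheme typically assigns the full base lr to the top transformer layer and multiplicatively smaller lrs to earlier layers. A minimal sketch of that idea (function name, decay factor, and base lr are illustrative assumptions, not the library's API):

```python
def layerwise_lrs(num_layers, base_lr=1e-5, decay=0.9):
    # Layer-wise lr decay: the last (top) layer gets base_lr, and each
    # earlier layer's lr is scaled down by `decay` per layer of distance.
    return [base_lr * decay ** (num_layers - 1 - i) for i in range(num_layers)]
```

For a 24-layer encoder such as DeBERTaV3-large, `layerwise_lrs(24)` would give the embedding-adjacent layers lrs roughly an order of magnitude smaller than the top layer's.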
- Other changes (bug fixes, small refactors)
  - Fix AGC (to return the parameter)
  - Make room for experimental features (at pytorch_optimizer.experimental)
  - base types
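For context on the AGC fix above: Adaptive Gradient Clipping rescales a gradient whose norm is too large relative to its parameter's norm, and the clipped value must be returned to the caller. A generic sketch of the technique using plain lists in place of tensors (this is not the library's code; names and defaults are illustrative):

```python
import math

def agc_clip(param, grad, clipping=1e-2, eps=1e-3):
    # Adaptive Gradient Clipping: rescale the gradient when its norm
    # exceeds `clipping` times the parameter norm, and return the
    # (possibly rescaled) gradient rather than mutating in place.
    p_norm = max(math.sqrt(sum(p * p for p in param)), eps)
    g_norm = math.sqrt(sum(g * g for g in grad))
    max_norm = clipping * p_norm
    if g_norm > max_norm:
        scale = max_norm / g_norm
        return [g * scale for g in grad]
    return grad
```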