README.rst (28 additions, 1 deletion)
@@ -16,7 +16,7 @@ pytorch-optimizer

 | **pytorch-optimizer** is a collection of optimizers & lr schedulers in PyTorch.
 | I just re-implemented the algorithms based on the original papers, with speed & memory tweaks and plug-ins. It also includes useful and practical optimization ideas.
-| Currently, 57 optimizers, 6 lr schedulers are supported!
+| Currently, 57 optimizers, 6 lr schedulers, and 10 loss functions are supported!
 |
 | Highly inspired by `pytorch-optimizer <https://github.com/jettify/pytorch-optimizer>`__.
@@ -240,6 +240,33 @@ You can check the supported learning rate schedulers with the code below.
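The body of this hunk is not reproduced above. As a rough illustration of the kind of introspection this README section describes, a minimal sketch using the package's `get_supported_optimizers` and `get_supported_lr_schedulers` helpers; the expected counts are taken from the README line above, not verified here:

```python
# A sketch of checking what pytorch-optimizer supports; these two helpers
# exist in the package, and the counts mirror the README claim above.
from pytorch_optimizer import get_supported_lr_schedulers, get_supported_optimizers

print(len(get_supported_optimizers()))     # 57 optimizers, per the README
print(len(get_supported_lr_schedulers()))  # 6 lr schedulers, per the README
```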
docs/changelogs/v2.11.0.md (9 additions, 0 deletions)
@@ -6,6 +6,15 @@

   * [Closing the Generalization Gap of Adaptive Gradient Methods in Training Deep Neural Networks](https://arxiv.org/abs/1806.06763)
 * Implement LOMO optimizer (#188)
   * [Full Parameter Fine-tuning for Large Language Models with Limited Resources](https://arxiv.org/abs/2306.09782)
+* Implement loss functions (#189)
+  * BCELoss
+  * BCEFocalLoss
+  * FocalLoss : [Focal Loss for Dense Object Detection](https://arxiv.org/abs/1708.02002)
+  * FocalCosineLoss : [Data-Efficient Deep Learning Method for Image Classification Using Data Augmentation, Focal Cosine Loss, and Ensemble](https://arxiv.org/abs/2007.07805)
+  * DiceLoss : [Generalised Dice overlap as a deep learning loss function for highly unbalanced segmentations](https://arxiv.org/abs/1707.03237v3)
+  * LDAMLoss : [Learning Imbalanced Datasets with Label-Distribution-Aware Margin Loss](https://arxiv.org/abs/1906.07413)
+  * JaccardLoss
+  * BiTemperedLogisticLoss : [Robust Bi-Tempered Logistic Loss Based on Bregman Divergences](https://arxiv.org/abs/1906.03361)
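For the FocalLoss entry above, a minimal sketch of the idea from the cited paper (Lin et al., 2017) in plain PyTorch. This is illustrative only, not the package's implementation; the fixed `alpha`/`gamma` values are the paper's common choices, not confirmed library defaults:

```python
import torch
import torch.nn.functional as F


def focal_loss(logits: torch.Tensor, targets: torch.Tensor,
               alpha: float = 0.25, gamma: float = 2.0) -> torch.Tensor:
    """Binary focal loss: FL(p_t) = -alpha * (1 - p_t) ** gamma * log(p_t)."""
    # Per-element BCE equals -log(p_t), so exp(-bce) recovers p_t, the
    # predicted probability of the true class.
    bce = F.binary_cross_entropy_with_logits(logits, targets, reduction="none")
    p_t = torch.exp(-bce)
    # Down-weight easy examples (p_t near 1) by the modulating factor.
    return (alpha * (1.0 - p_t) ** gamma * bce).mean()


logits = torch.randn(8, requires_grad=True)
targets = torch.randint(0, 2, (8,)).float()

loss = focal_loss(logits, targets)
loss.backward()
```

The modulating factor `(1 - p_t) ** gamma` is what distinguishes focal loss from plain BCE: well-classified examples contribute almost nothing, so training focuses on the hard, misclassified ones.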