README.rst (9 additions, 9 deletions)
@@ -16,7 +16,7 @@ pytorch-optimizer
 **pytorch-optimizer** is optimizer & lr scheduler collections in PyTorch.
 I just re-implemented the algorithms (with speed & memory tweaks and plug-ins) based on the original papers. Also, it includes useful and practical optimization ideas.
-Currently, 59 optimizers, 10 lr schedulers, and 13 loss functions are supported!
+Currently, **60 optimizers**, **10 lr schedulers**, and **13 loss functions** are supported!

 Highly inspired by `pytorch-optimizer <https://github.com/jettify/pytorch-optimizer>`__.
@@ -31,20 +31,20 @@ So, please double-check the license before using it at your work.
 Installation
 ~~~~~~~~~~~~

-::
+.. code-block:: bash

     $ pip3 install -U pytorch-optimizer

 If there's a version issue when installing the package, try with the `--no-deps` option.

-::
+.. code-block:: bash

     $ pip3 install -U --no-deps pytorch-optimizer

 Simple Usage
 ~~~~~~~~~~~~

-::
+.. code-block:: python

     from pytorch_optimizer import AdamP
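The usage pattern behind this hunk (construct an optimizer over the model's parameters, then call ``zero_grad()``/``step()`` each iteration) can be sketched without installing torch. ``TinySGD`` below is a plain-Python stand-in for illustration only, not the library's ``AdamP``:

```python
# Torch-free sketch of the optimizer usage pattern (illustrative only;
# TinySGD is a hypothetical stand-in, not part of pytorch-optimizer).
class TinySGD:
    def __init__(self, params, lr=0.1):
        self.params = params          # each param is a [value, grad] pair
        self.lr = lr

    def zero_grad(self):
        for p in self.params:
            p[1] = 0.0                # clear the stored gradient

    def step(self):
        for p in self.params:
            p[0] -= self.lr * p[1]    # plain gradient-descent update

param = [4.0, 0.0]                    # minimize f(x) = x^2, so grad = 2x
optimizer = TinySGD([param], lr=0.1)
for _ in range(50):
    optimizer.zero_grad()
    param[1] = 2.0 * param[0]         # the "backward" pass
    optimizer.step()
print(param[0])                       # converges toward the minimum at 0
```

With a real optimizer from the package, only the construction line changes; the ``zero_grad()``/``step()`` loop stays the same.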
@@ -61,7 +61,7 @@ Simple Usage

 Also, you can load the optimizer via `torch.hub`.

-::
+.. code-block:: python

     import torch
@@ -71,7 +71,7 @@ Also, you can load the optimizer via `torch.hub`

 If you want to build the optimizer with parameters & configs, there's the `create_optimizer()` API.

-::
+.. code-block:: python

     from pytorch_optimizer import create_optimizer
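A ``create_optimizer()``-style API is typically a name-to-class factory. The sketch below shows that pattern in isolation; the registry contents, the ``register`` decorator, and ``TinySGD`` are all hypothetical, not the library's actual implementation:

```python
# Hedged sketch of a string -> optimizer-class factory with keyword
# pass-through (registry and class names are hypothetical).
_OPTIMIZERS = {}

def register(name):
    def decorator(cls):
        _OPTIMIZERS[name.lower()] = cls
        return cls
    return decorator

@register("tinysgd")
class TinySGD:
    def __init__(self, params, lr=0.1):
        self.params, self.lr = params, lr

def create_optimizer(params, optimizer_name, **kwargs):
    try:
        cls = _OPTIMIZERS[optimizer_name.lower()]   # case-insensitive lookup
    except KeyError:
        raise ValueError(f"unsupported optimizer: {optimizer_name}")
    return cls(params, **kwargs)                    # forward extra configs

opt = create_optimizer([], "TinySGD", lr=0.01)
print(type(opt).__name__, opt.lr)
```

The keyword pass-through is what lets one factory call cover optimizers with different hyperparameter sets.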
@@ -89,7 +89,7 @@ Supported Optimizers

 You can check the supported optimizers with the code below.

-::
+.. code-block:: python

     from pytorch_optimizer import get_supported_optimizers
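A ``get_supported_optimizers()``-style helper usually just reads the same registry back out. In this sketch the registry contents are made-up example names, not the library's actual list:

```python
# Sketch: deriving a "supported optimizers" listing from a registry
# (the three names here are examples, not the real package contents).
_OPTIMIZERS = {"adamp": object, "lamb": object, "madgrad": object}

def get_supported_optimizers():
    return sorted(_OPTIMIZERS)   # stable, alphabetical listing

names = get_supported_optimizers()
print(names)
```

The same pattern covers the ``get_supported_lr_schedulers`` and ``get_supported_loss_functions`` helpers in the later hunks.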
@@ -230,7 +230,7 @@ Supported LR Scheduler

 You can check the supported learning rate schedulers with the code below.

-::
+.. code-block:: python

     from pytorch_optimizer import get_supported_lr_schedulers
@@ -249,7 +249,7 @@ Supported Loss Function

 You can check the supported loss functions with the code below.

-::
+.. code-block:: python

     from pytorch_optimizer import get_supported_loss_functions