
Commit f56aeb6

docs: create_optimizer

1 parent 935d5ae

1 file changed: 18 additions & 5 deletions

README.rst

Lines changed: 18 additions & 5 deletions
@@ -65,8 +65,25 @@ Also, you can load the optimizer via `torch.hub`
     opt = torch.hub.load('kozistr/pytorch_optimizer', 'adamp')
     optimizer = opt(model.parameters())
 
+If you want to build the optimizer with parameters & configs, there's the `create_optimizer()` API.
 
-And you can check the supported optimizers & lr schedulers.
+::
+
+    from pytorch_optimizer import create_optimizer
+
+    optimizer = create_optimizer(
+        model,
+        'adamp',
+        lr=1e-2,
+        weight_decay=1e-3,
+        use_gc=True,
+        use_lookahead=True,
+    )
+
+Supported Optimizers
+--------------------
+
+You can check the supported optimizers & lr schedulers.
 
 ::
 
@@ -75,10 +92,6 @@ And you can check the supported optimizers & lr schedulers.
     supported_optimizers = get_supported_optimizers()
     supported_lr_schedulers = get_supported_lr_schedulers()
 
-
-Supported Optimizers
---------------------
-
 +--------------+-------------------------------------------------------------------------------------------------+-----------------------------------------------------------------------------------+-----------------------------------------------------------------------------------------------+
 | Optimizer    | Description                                                                                     | Official Code                                                                     | Paper                                                                                         |
 +==============+=================================================================================================+===================================================================================+===============================================================================================+
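For context, here is how the snippet added in this commit would run end to end. This is a minimal sketch, not part of the commit: the `torch.nn.Linear` model and the training step are assumptions for illustration, and the comments on `use_gc` / `use_lookahead` are informed guesses from the flag names; only the `create_optimizer(...)` call itself mirrors the diff above.

::

    import torch

    from pytorch_optimizer import create_optimizer

    # Any torch.nn.Module works; a single Linear layer keeps the sketch small.
    model = torch.nn.Linear(10, 2)

    # Build the optimizer by name plus configs, as in the added README snippet.
    optimizer = create_optimizer(
        model,
        'adamp',
        lr=1e-2,
        weight_decay=1e-3,
        use_gc=True,         # assumption: enables gradient centralization
        use_lookahead=True,  # assumption: wraps the optimizer with Lookahead
    )

    # One ordinary training step to show the result behaves like any optimizer.
    loss = model(torch.randn(4, 10)).sum()
    loss.backward()
    optimizer.step()
    optimizer.zero_grad()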
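The discovery helpers kept as context in the second hunk can be sketched in use as well. Assuming `get_supported_optimizers()` and `get_supported_lr_schedulers()` return sequences of classes (an assumption, not verified from this diff), printing their names gives a quick inventory:

::

    from pytorch_optimizer import get_supported_lr_schedulers, get_supported_optimizers

    # Both calls appear verbatim in the README snippet above.
    supported_optimizers = get_supported_optimizers()
    supported_lr_schedulers = get_supported_lr_schedulers()

    # Assumption: each entry is a class, so __name__ is a readable label.
    print(sorted(opt.__name__ for opt in supported_optimizers))
    print(sorted(sched.__name__ for sched in supported_lr_schedulers))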
