
Commit 0d5f767

williamFalcon authored and Borda committed
updated docs
1 parent be89eb0 commit 0d5f767


2 files changed: +32 −4 lines changed


docs/source/hyperparameters.rst

Lines changed: 15 additions & 0 deletions
@@ -55,6 +55,21 @@ Now we can parametrize the LightningModule.
 .. note:: Bonus! if `hparams` is in your module, Lightning will save it into the checkpoint and restore your
     model using those hparams exactly.

+And we can also add all the flags available in the Trainer to the ArgumentParser.
+
+.. code-block:: python
+
+    # add all the available Trainer options to the ArgParser
+    parser = pl.Trainer.add_argparse_args(parser)
+    args = parser.parse_args()
+
+And now you can start your program with:
+
+.. code-block:: bash
+
+    # now you can use any trainer flag
+    $ python main.py --num_nodes 2 --gpus 8
+
 Trainer args
 ^^^^^^^^^^^^
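
To see where this addition fits in practice, here is a minimal sketch of a complete `main.py` built around the snippet above. `LitModel` and the import path are hypothetical placeholders (not part of this commit), and `Trainer.from_argparse_args` is assumed to be available in your Lightning version.

.. code-block:: python

    # minimal sketch; LitModel and the flag names are illustrative,
    # and Trainer.from_argparse_args is assumed available in your version
    from argparse import ArgumentParser

    import pytorch_lightning as pl
    from my_project import LitModel  # hypothetical LightningModule

    def main():
        parser = ArgumentParser()

        # model-specific flags, as in the guide's example
        parser.add_argument('--layer_2_dim', type=int, default=256)
        parser.add_argument('--batch_size', type=int, default=64)

        # add all the available Trainer options to the ArgParser
        parser = pl.Trainer.add_argparse_args(parser)
        args = parser.parse_args()

        # build the model from the parsed hparams and train it
        model = LitModel(args)
        trainer = pl.Trainer.from_argparse_args(args)
        trainer.fit(model)

    if __name__ == '__main__':
        main()

With this wiring, any Trainer flag (e.g. `--max_epochs`, `--gpus`) is accepted on the command line alongside the model's own flags.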

docs/source/introduction_guide.rst

Lines changed: 17 additions & 4 deletions
@@ -578,7 +578,7 @@ Notice the epoch is MUCH faster!
 Hyperparameters
 ---------------
 Normally, we don't hard-code the values to a model. We usually use the command line to
-modify the network. The `Trainer` can add all the available options to an ArgumentParser.
+modify the network.

 .. code-block:: python

@@ -591,9 +591,6 @@ modify the network. The `Trainer` can add all the available options to an ArgumentParser.
     parser.add_argument('--layer_2_dim', type=int, default=256)
     parser.add_argument('--batch_size', type=int, default=64)

-    # add all the available options to the trainer
-    parser = pl.Trainer.add_argparse_args(parser)
-
     args = parser.parse_args()

 Now we can parametrize the LightningModule.

@@ -626,6 +623,22 @@ Now we can parametrize the LightningModule.
 .. note:: Bonus! if `hparams` is in your module, Lightning will save it into the checkpoint and restore your
     model using those hparams exactly.

+And we can also add all the flags available in the Trainer to the ArgumentParser.
+
+.. code-block:: python
+
+    # add all the available Trainer options to the ArgParser
+    parser = pl.Trainer.add_argparse_args(parser)
+    args = parser.parse_args()
+
+And now you can start your program with:
+
+.. code-block:: bash
+
+    # now you can use any trainer flag
+    $ python main.py --num_nodes 2 --gpus 8
+
 For a full guide on using hyperparameters, `check out the hyperparameters docs <hyperparameters.rst>`_.

 ---------
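
As a companion to the `hparams` note in both files, here is a minimal sketch of a LightningModule parametrized this way. The layer sizes, flag names, and loss are illustrative assumptions; newer Lightning versions replace the `self.hparams = hparams` assignment with `self.save_hyperparameters()`.

.. code-block:: python

    # illustrative sketch of an hparams-driven LightningModule;
    # sizes and flag names are assumptions, not from this commit
    import torch
    from torch import nn
    import pytorch_lightning as pl

    class LitModel(pl.LightningModule):
        def __init__(self, hparams):
            super().__init__()
            # storing the parsed Namespace on self.hparams is what lets
            # Lightning save it to the checkpoint and restore the model
            self.hparams = hparams
            self.layer_1 = nn.Linear(28 * 28, hparams.layer_1_dim)
            self.layer_2 = nn.Linear(hparams.layer_1_dim, hparams.layer_2_dim)

        def forward(self, x):
            x = torch.relu(self.layer_1(x.view(x.size(0), -1)))
            return self.layer_2(x)

        def training_step(self, batch, batch_idx):
            x, y = batch
            loss = nn.functional.cross_entropy(self(x), y)
            return {'loss': loss}

        def configure_optimizers(self):
            return torch.optim.Adam(self.parameters(), lr=0.001)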
