@@ -578,7 +578,7 @@ Notice the epoch is MUCH faster!
 Hyperparameters
 ---------------
 Normally, we don't hard-code the values to a model. We usually use the command line to
-modify the network. The `Trainer` can add all the available options to an ArgumentParser.
+modify the network.

 .. code-block:: python

@@ -591,9 +591,6 @@ modify the network. The `Trainer` can add all the available options to an Argume
     parser.add_argument('--layer_2_dim', type=int, default=256)
     parser.add_argument('--batch_size', type=int, default=64)

-    # add all the available options to the trainer
-    parser = pl.Trainer.add_argparse_args(parser)
-
     args = parser.parse_args()

 Now we can parametrize the LightningModule.
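The parametrized module itself is outside this hunk; purely as a sketch (the class name `LitModel`, the 28 * 28 input size, and the `--layer_1_dim` flag are assumptions, not taken from the diff), the parsed flags could be consumed like this:

.. code-block:: python

    import torch
    from torch import nn
    import pytorch_lightning as pl


    class LitModel(pl.LightningModule):
        """Hypothetical module wired to the argparse flags parsed above."""

        def __init__(self, hparams):
            super().__init__()
            # hparams is the argparse Namespace; layer_1_dim is assumed to be
            # defined earlier in the parser alongside layer_2_dim and batch_size
            self.layer_1 = nn.Linear(28 * 28, hparams.layer_1_dim)
            self.layer_2 = nn.Linear(hparams.layer_1_dim, hparams.layer_2_dim)

        def forward(self, x):
            x = x.view(x.size(0), -1)   # flatten, assuming 28x28 image input
            x = torch.relu(self.layer_1(x))
            return self.layer_2(x)


    model = LitModel(args)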
@@ -626,6 +623,22 @@ Now we can parametrize the LightningModule.
 .. note:: Bonus! if (hparams) is in your module, Lightning will save it into the checkpoint and restore your
    model using those hparams exactly.

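As a hedged illustration of that bonus (the class name and checkpoint path are placeholders, not from the diff), restoring later needs only the checkpoint file, because the hparams travel inside it:

.. code-block:: python

    # 'example.ckpt' is a placeholder path; the saved hparams are read back
    # from the checkpoint, so they don't have to be passed in again
    model = LitModel.load_from_checkpoint('example.ckpt')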
+And we can also add all the flags available in the Trainer to the ArgumentParser.
+
+.. code-block:: python
+
+    # add all the available Trainer options to the ArgParser
+    parser = pl.Trainer.add_argparse_args(parser)
+    args = parser.parse_args()
+
+And now you can start your program with:
+
+.. code-block:: bash
+
+    # now you can use any trainer flag
+    $ python main.py --num_nodes 2 --gpus 8
+
+
 For a full guide on using hyperparameters, `check out the hyperparameters docs <hyperparameters.rst>`_.

 ---------