use_distributed_sampler
1 parent 3032288 commit 8074219
docs/source-pytorch/upgrade/sections/1_9_advanced.rst
@@ -39,7 +39,7 @@
      - #16745 #16745
 
    * - used Trainer’s flag ``replace_sampler_ddp``
-     - use ``use_distributed_sample``; the sampler gets created not only for the DDP strategies
+     - use ``use_distributed_sampler``; the sampler gets created not only for the DDP strategies
      -
 
    * - relied on the ``on_tpu`` argument in ``LightningModule.optimizer_step`` hook
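For context, a minimal sketch of the migration this docs fix describes, assuming PyTorch Lightning >= 2.0 (where the old Trainer flag was removed in favor of the one named in the corrected row above); the Trainer is otherwise left at defaults for illustration:

# Sketch of the Trainer flag rename (assumes PyTorch Lightning >= 2.0).
import lightning.pytorch as pl

# Lightning 1.x spelling, removed in 2.0:
# trainer = pl.Trainer(replace_sampler_ddp=True)

# Lightning 2.0 spelling; per the docs row above, the distributed sampler
# is now created for all distributed strategies, not only DDP:
trainer = pl.Trainer(use_distributed_sampler=True)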