1 parent 033e8e8 commit 57e864a
src/lightning/pytorch/strategies/ddp.py
@@ -422,7 +422,8 @@ def teardown(self) -> None:
 class MultiModelDDPStrategy(DDPStrategy):
     """Specific strategy for training on multiple models with multiple optimizers (e.g. GAN training).
 
-    This strategy allows to wrap multiple models with DDP, rather than just one which is about just normal DDPStrategy.
+    This strategy wraps each individual child module in a :class:`~torch.nn.parallel.distributed.DistributedDataParallel` module.
+    It ensures that manual backward updates only the parameters of the targeted child module, preventing cross-references between modules' parameters.
 
     """
 
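To make the new docstring concrete, below is a minimal sketch of how this strategy could be used for GAN-style training with two child modules and manual optimization. The `TinyGAN` module, its layer sizes, the loss terms, and the exact import path of `MultiModelDDPStrategy` are illustrative assumptions, not part of this commit.

import torch
import torch.nn as nn
import torch.nn.functional as F
import lightning.pytorch as pl
# Import path assumed from this diff's file (src/lightning/pytorch/strategies/ddp.py);
# the public re-export location may differ.
from lightning.pytorch.strategies.ddp import MultiModelDDPStrategy


class TinyGAN(pl.LightningModule):
    """Illustrative two-model module; layer sizes and losses are placeholders."""

    def __init__(self):
        super().__init__()
        # Manual optimization: each step calls manual_backward() on one child
        # module's loss, which is the case the new docstring targets.
        self.automatic_optimization = False
        self.generator = nn.Sequential(nn.Linear(8, 16), nn.ReLU(), nn.Linear(16, 8))
        self.discriminator = nn.Sequential(nn.Linear(8, 16), nn.ReLU(), nn.Linear(16, 1))

    def training_step(self, batch, batch_idx):
        real = batch
        opt_g, opt_d = self.optimizers()

        # Discriminator update: only discriminator parameters should change.
        fake = self.generator(torch.randn_like(real))
        d_logits_real = self.discriminator(real)
        d_logits_fake = self.discriminator(fake.detach())
        d_loss = F.binary_cross_entropy_with_logits(
            d_logits_real, torch.ones_like(d_logits_real)
        ) + F.binary_cross_entropy_with_logits(
            d_logits_fake, torch.zeros_like(d_logits_fake)
        )
        opt_d.zero_grad()
        self.manual_backward(d_loss)
        opt_d.step()

        # Generator update: only generator parameters should change.
        g_logits = self.discriminator(fake)
        g_loss = F.binary_cross_entropy_with_logits(g_logits, torch.ones_like(g_logits))
        opt_g.zero_grad()
        self.manual_backward(g_loss)
        opt_g.step()

    def configure_optimizers(self):
        return (
            torch.optim.Adam(self.generator.parameters(), lr=2e-4),
            torch.optim.Adam(self.discriminator.parameters(), lr=2e-4),
        )


# Per the docstring above, the strategy would wrap each child module
# (generator, discriminator) in its own DistributedDataParallel instance.
trainer = pl.Trainer(strategy=MultiModelDDPStrategy(), accelerator="gpu", devices=2)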