
Commit 57e864a

update

Borda and SkafteNicki authored
Co-authored-by: Nicki Skafte Detlefsen <[email protected]>

1 parent 033e8e8

File tree

1 file changed: +2 −1 lines
  • src/lightning/pytorch/strategies


src/lightning/pytorch/strategies/ddp.py

Lines changed: 2 additions & 1 deletion
@@ -422,7 +422,8 @@ def teardown(self) -> None:
 class MultiModelDDPStrategy(DDPStrategy):
     """Specific strategy for training on multiple models with multiple optimizers (e.g. GAN training).
 
-    This strategy allows to wrap multiple models with DDP, rather than just one which is about just normal DDPStrategy.
+    This strategy wraps each individual child module in :class:`~torch.nn.parallel.distributed.DistributedDataParallel`.
+    It ensures that manual backward updates only the parameters of the targeted child module, preventing cross-references between modules' parameters.
 
     """
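
For context, the setup the new docstring describes corresponds to a manual-optimization GAN like the sketch below. This is not taken from the repository: the model shapes, losses, and hyperparameters are illustrative, and the import path for MultiModelDDPStrategy is inferred from the file touched by this commit rather than confirmed by the diff.

# Minimal sketch of a manual-optimization GAN, assuming MultiModelDDPStrategy
# is importable from the module this commit edits. All names and numbers
# below are illustrative assumptions, not the library's own example.
import torch
import torch.nn as nn
import torch.nn.functional as F
import lightning.pytorch as pl
from lightning.pytorch.strategies.ddp import MultiModelDDPStrategy  # assumed import path


class TinyGAN(pl.LightningModule):
    def __init__(self):
        super().__init__()
        # Two top-level child modules; per the new docstring, the strategy
        # wraps each of them in its own DistributedDataParallel instance.
        self.generator = nn.Sequential(nn.Linear(16, 64), nn.ReLU(), nn.Linear(64, 64))
        self.discriminator = nn.Sequential(nn.Linear(64, 64), nn.ReLU(), nn.Linear(64, 1))
        self.automatic_optimization = False  # manual backward, as the docstring assumes

    def configure_optimizers(self):
        opt_g = torch.optim.Adam(self.generator.parameters(), lr=2e-4)
        opt_d = torch.optim.Adam(self.discriminator.parameters(), lr=2e-4)
        return opt_g, opt_d

    def training_step(self, batch, batch_idx):
        # `batch` is assumed to be a (N, 64) tensor of real samples.
        opt_g, opt_d = self.optimizers()
        z = torch.randn(batch.size(0), 16, device=self.device)
        ones = torch.ones(batch.size(0), 1, device=self.device)
        zeros = torch.zeros(batch.size(0), 1, device=self.device)

        # Discriminator step: this backward/step pair targets only the
        # discriminator's parameters.
        fake = self.generator(z).detach()
        d_loss = F.binary_cross_entropy_with_logits(self.discriminator(batch), ones) \
            + F.binary_cross_entropy_with_logits(self.discriminator(fake), zeros)
        opt_d.zero_grad()
        self.manual_backward(d_loss)
        opt_d.step()

        # Generator step: this backward/step pair targets only the
        # generator's parameters.
        g_loss = F.binary_cross_entropy_with_logits(self.discriminator(self.generator(z)), ones)
        opt_g.zero_grad()
        self.manual_backward(g_loss)
        opt_g.step()


# Hypothetical wiring; constructor arguments for the strategy are not shown
# in this diff, so defaults are assumed here.
trainer = pl.Trainer(accelerator="gpu", devices=2, strategy=MultiModelDDPStrategy())

Wrapping the generator and discriminator in separate DDP instances, rather than wrapping the whole LightningModule once as the plain DDPStrategy does, keeps each manual backward/step pair scoped to a single child module's parameters, which is exactly what the added docstring lines promise.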
