find_unused_parameters in the lightning trainer (1.3.2) #7796
Answered by awaelchli
shamanez asked this question in DDP / multi-GPU / multi-node
-
🐛 Bug
If there are unused parameters in the model, should we still explicitly mention it to the trainer as follows? plugins=[DDPPlugin(find_unused_parameters=True)]
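For context, a minimal sketch of the setup the question describes, assuming the Lightning 1.3.x plugin API; the GPU count is illustrative and the model/dataloader are placeholders defined elsewhere:

```python
from pytorch_lightning import Trainer
from pytorch_lightning.plugins import DDPPlugin

# Explicitly enabling unused-parameter detection, as in the question.
trainer = Trainer(
    gpus=2,
    accelerator="ddp",
    plugins=[DDPPlugin(find_unused_parameters=True)],
)
# trainer.fit(model, train_dataloader)  # model and dataloader defined elsewhere
```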
Answered by awaelchli on Jun 1, 2021
Replies: 2 comments
-
No, it is set to True by default and you want to turn it off unless you need it :)
Reference:
https://pytorch-lightning.readthedocs.io/en/1.3.3/benchmarking/performance.html#when-using-ddp-set-find-unused-parameters-false
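A minimal sketch of the setting recommended in the linked performance guide (same hedged 1.3.x setup as above); leaving the plugin out entirely keeps Lightning's default of True:

```python
from pytorch_lightning import Trainer
from pytorch_lightning.plugins import DDPPlugin

# find_unused_parameters defaults to True in Lightning 1.3.x; setting it to
# False skips the extra traversal of the autograd graph on each backward
# pass, which is faster when every parameter actually receives a gradient.
trainer = Trainer(
    gpus=2,
    accelerator="ddp",
    plugins=[DDPPlugin(find_unused_parameters=False)],
)
```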
Answer selected by awaelchli
-
Thanks. Got it