Module not able to find parameters requiring a gradient #17555
Unanswered
surya-narayanan asked this question in DDP / multi-GPU / multi-node
Replies: 0 comments
I have a model that trains fine on 1 GPU but fails on 2 GPUs with:

RuntimeError: DistributedDataParallel is not needed when a module doesn't have any parameter that requires a gradient

From what I've found online, this error comes up when someone is only running forward passes and not training. But I am training, and I have train dataloaders, so I don't understand why I'm hitting it. Any thoughts?
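For context, my understanding is that this RuntimeError is raised by the DistributedDataParallel constructor itself, before any training step runs, whenever the wrapped module has no parameter with requires_grad=True. Here is a toy sketch of that failure mode and the check I've been using (an nn.Linear stands in for my real model):

```python
import torch.nn as nn

model = nn.Linear(10, 10)

# A blanket freeze (e.g. leftover from a fine-tuning or feature-extraction
# branch) leaves DDP with nothing to synchronize, and its constructor
# raises the RuntimeError quoted above:
for p in model.parameters():
    p.requires_grad_(False)

print([n for n, p in model.named_parameters() if p.requires_grad])
# [] -> wrapping this module in DDP would fail

# Unfreezing at least one parameter (here, the whole module) avoids it:
model.requires_grad_(True)
print([n for n, p in model.named_parameters() if p.requires_grad])
# ['weight', 'bias'] -> DDP wraps without error
```

One thing I noticed while debugging: my single-GPU path never wraps the model in DDP, so if some parameters were accidentally frozen, that would only surface in the 2-GPU run. I haven't confirmed that's what is happening here, though.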