DDP with shared file system #8496
Answered by carmocca
YuShen1116 asked this question in DDP / multi-GPU / multi-node
Is it possible to use a shared filesystem for the DDP `init_group` in PyTorch Lightning? If so, what should I do to the Trainer? Thanks!
Answered by carmocca on Jul 22, 2021
I'm not quite sure about what you want to do, but if it's about customizing DDP, you can do the following:

```python
from pytorch_lightning.plugins import DDPPlugin

class MyCustomDDP(DDPPlugin):
    ...

trainer = Trainer(plugins=[MyCustomDDP()])
```
Answer selected by YuShen1116