Confused with Multi-GPU in PL #12835
Unanswered
yipliu
asked this question in
Lightning Trainer API: Trainer, LightningModule, LightningDataModule
Replies: 1 comment 2 replies
-
The docs say:

"DP is not distributed training, rather just data-parallel. So if you are using DP, then go with `validation_step_end`; else, if you are using DDP or any other distributed strategy, go with `sync_dist=True`."

Feel free to send a PR to update the docs if you think they can be improved :)
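For context, here is a plain-Python sketch of why DP needs `validation_step_end`: DP splits each batch across devices, runs `validation_step` once per shard, and hands the gathered per-device outputs to `validation_step_end` for reduction. The `run_dp_validation` helper and the trivial "loss" are hypothetical, just to mimic the gather; no GPUs or Lightning are required to follow the data flow.

```python
# Hypothetical mimic of DataParallel's split/gather for one validation batch.
# Real Lightning runs validation_step on each GPU's shard, then passes the
# gathered per-device outputs to validation_step_end.

def validation_step(batch_shard):
    # Per-device work: compute a partial "loss" on this shard (placeholder).
    return {"loss": sum(batch_shard) / len(batch_shard)}

def validation_step_end(step_outputs):
    # Reduce the gathered per-device results into one value (mean over devices).
    losses = [out["loss"] for out in step_outputs]
    return sum(losses) / len(losses)

def run_dp_validation(batch, num_devices):
    # Split the batch into equal shards, one per "device".
    size = len(batch) // num_devices
    shards = [batch[i * size:(i + 1) * size] for i in range(num_devices)]
    step_outputs = [validation_step(shard) for shard in shards]  # parallel on real GPUs
    return validation_step_end(step_outputs)

print(run_dp_validation([1.0, 2.0, 3.0, 4.0], num_devices=2))  # → 2.5
```

Under DDP there is no such gather: each process runs the full `validation_step` on its own data, so cross-process reduction happens at logging time via `sync_dist=True` instead.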
-
Hi,
I have read the docs here and here for validation_step_end.
My confusion is: do we just need to set sync_dist=True in validation_step as described here, or should we instead do it as shown here for validation_step_end?
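To make the contrast concrete, here is a minimal sketch of the two patterns side by side. The stub base class stands in for `pytorch_lightning.LightningModule` so the example runs without Lightning installed; the model classes and their trivial losses are illustrative, not from the thread.

```python
class LightningModule:
    """Stub standing in for pytorch_lightning.LightningModule."""
    def __init__(self):
        self.logged = {}

    def log(self, name, value, sync_dist=False):
        # Real Lightning reduces the metric across processes when
        # sync_dist=True under DDP; here we just record the call.
        self.logged[name] = (value, sync_dist)


class DDPStyleModel(LightningModule):
    # DDP (and other truly distributed strategies): each process runs the
    # full step on its own data, so reduce at logging time with sync_dist=True.
    def validation_step(self, batch, batch_idx):
        loss = sum(batch) / len(batch)  # placeholder for a real loss
        self.log("val_loss", loss, sync_dist=True)
        return loss


class DPStyleModel(LightningModule):
    # DP: validation_step runs once per GPU shard; reduce the gathered
    # outputs in validation_step_end instead.
    def validation_step(self, batch, batch_idx):
        return {"loss": sum(batch) / len(batch)}

    def validation_step_end(self, outputs):
        losses = [o["loss"] for o in outputs]
        loss = sum(losses) / len(losses)
        self.log("val_loss", loss)  # already reduced; no sync_dist needed
        return loss
```

So the answer to "which one": they are not alternatives you pick freely; the strategy picks for you. DP hands you per-device outputs to merge in `validation_step_end`; DDP never gathers step outputs, so `sync_dist=True` on `self.log` is what synchronizes the metric.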