accumulate_grad_batches and DDP #7632
Answered by carmocca

RaivoKoot asked this question in DDP / multi-GPU / multi-node
Hi! With batch_size = 16 and accumulate_grad_batches=2, my effective batch size is 32. My question is: if I now also train with DDP on 2 GPUs, is my effective batch size now 64?
Answered by carmocca on May 20, 2021
Yes: https://pytorch-lightning.readthedocs.io/en/latest/advanced/multi_gpu.html#batch-size
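
For context, a minimal sketch of how the factors multiply, assuming 2 GPUs and a Lightning release from around this time where the Trainer takes gpus= and accelerator="ddp" (newer versions spell these devices= and strategy="ddp"); ToyModel and the random dataset are purely illustrative:

```python
import torch
from torch.utils.data import DataLoader, TensorDataset
import pytorch_lightning as pl


class ToyModel(pl.LightningModule):
    def __init__(self):
        super().__init__()
        self.layer = torch.nn.Linear(32, 2)

    def training_step(self, batch, batch_idx):
        x, y = batch
        return torch.nn.functional.cross_entropy(self.layer(x), y)

    def configure_optimizers(self):
        return torch.optim.SGD(self.parameters(), lr=0.1)


# 1024 random samples; each DataLoader batch holds 16 samples per process/GPU.
dataset = TensorDataset(torch.randn(1024, 32), torch.randint(0, 2, (1024,)))
train_loader = DataLoader(dataset, batch_size=16)

trainer = pl.Trainer(
    gpus=2,                     # DDP runs one process per GPU
    accelerator="ddp",          # newer releases: devices=2, strategy="ddp"
    accumulate_grad_batches=2,  # optimizer steps every 2 batches
    max_epochs=1,
)

# Effective batch size = per-GPU batch size * accumulate_grad_batches * num GPUs
#                      = 16 * 2 * 2 = 64
trainer.fit(ToyModel(), train_loader)
```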