Question about the log system #11670
Hello, I have a question about the `self.log` function and the `batch_size` argument when using the Trainer.
Replies: 1 comment 4 replies
hey @exiawsh
It should stay as the batch size for a single device only. With DDP, if you set `batch_size=7`, then each device gets a batch of `batch_size=7`, and the effective batch size increases with the number of devices. Now, if you want to log by accumulating metrics across devices, you need to set `sync_dist=True`. Check out the section here: https://pytorch-lightning.readthedocs.io/en/latest/extensions/logging.html#automatic-logging
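To make the arithmetic concrete, here is a minimal sketch. The per-device batch size of 7 and the device count of 4 are made-up example numbers, and the `training_step` shown in the comments is only an illustrative outline of how `self.log` would be called inside a `LightningModule` (the `compute_loss` helper is hypothetical):

```python
# Assumed example values, not from the original discussion.
PER_DEVICE_BATCH_SIZE = 7   # the batch_size you pass to your DataLoader
NUM_DEVICES = 4             # e.g. 4 GPUs under DDP

# Under DDP each process loads its own batch of PER_DEVICE_BATCH_SIZE
# samples, so the effective batch size grows with the device count.
effective_batch_size = PER_DEVICE_BATCH_SIZE * NUM_DEVICES
print(effective_batch_size)  # 7 * 4 = 28

# Inside a LightningModule, the logging call would look roughly like:
#
# def training_step(self, batch, batch_idx):
#     loss = self.compute_loss(batch)  # hypothetical helper
#     # batch_size is the *per-device* size; sync_dist=True reduces
#     # (averages) the metric across all DDP processes before logging.
#     self.log("train_loss", loss,
#              batch_size=PER_DEVICE_BATCH_SIZE,
#              sync_dist=True)
#     return loss
```

The key design point is that `self.log` only needs the per-device `batch_size` for correct weighted averaging; cross-device aggregation is opted into separately via `sync_dist=True`.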