SimCLR training: NT-Xent loss dips at regular intervals (related to batch size) #8417
Unanswered. elizastarr asked this question in Lightning Trainer API: Trainer, LightningModule, LightningDataModule.
I'm training an encoder with SimCLR on my own dataset, and the NT-Xent training loss suddenly dips to roughly the same value for a single iteration: about every 25 epochs with a batch size of 256 (global batch size 512, 771 iterations/epoch), and about every 10 epochs with a batch size of 350 (global batch size 600, 564 iterations/epoch). Any ideas about why this is happening would be very much appreciated!
Other settings:
Thanks!!
Replies: 1 comment

@ananyahjha93 Any idea?
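One common source of a single-iteration dip like this is the final, smaller batch of each epoch: if the dataset size is not divisible by the batch size and the DataLoader is built with drop_last=False, the last batch has fewer samples, and NT-Xent contrasts each sample against 2N-2 negatives, so a smaller batch yields a lower loss for that one step. Below is a minimal, self-contained sketch of the effect; the nt_xent function and the remainder size 37 are illustrative assumptions, not the asker's actual code or data.

```python
import torch
import torch.nn.functional as F

def nt_xent(z1: torch.Tensor, z2: torch.Tensor, temperature: float = 0.5) -> torch.Tensor:
    """NT-Xent loss over a batch of N image pairs (illustrative implementation).

    z1, z2: (N, D) projections of the two augmented views of the same N images.
    """
    n = z1.size(0)
    z = F.normalize(torch.cat([z1, z2], dim=0), dim=1)   # (2N, D), unit-norm rows
    sim = z @ z.t() / temperature                        # (2N, 2N) scaled cosine similarities
    sim.fill_diagonal_(float("-inf"))                    # mask each sample's self-similarity
    # The positive for row i is its other augmented view: i+N for i<N, i-N otherwise.
    targets = torch.cat([torch.arange(n) + n, torch.arange(n)])
    return F.cross_entropy(sim, targets)

torch.manual_seed(0)
dim = 128
for n in (256, 37):  # 37 stands in for a hypothetical smaller last batch
    z1, z2 = torch.randn(n, dim), torch.randn(n, dim)
    print(f"batch {n:3d}: NT-Xent on random embeddings = {nt_xent(z1, z2):.3f}")
```

With near-random embeddings the loss is roughly log(2N-1), so the 37-sample batch scores visibly lower than the 256-sample one. Whether such a dip shows up every 10 or 25 epochs in the logged curve would then depend on how the logging stride lines up with the epoch length; checking the Trainer's log_every_n_steps setting and whether the DataLoader uses drop_last=False would confirm or rule this out.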