Loss discrepancies when using the trainer #15442
Answered by martinez-zacharya
martinez-zacharya asked this question in Lightning Trainer API: Trainer, LightningModule, LightningDataModule
-
I'm encountering a weird training issue: my loss explodes after a single sample (roughly from 8 to 9.1188e+11) when I return the loss at the end of my training_step() function. However, when I don't return anything (i.e. comment out return(loss)), the loss stays stable across all samples. Has anyone seen anything like this? Perhaps I'm misunderstanding how Lightning performs the training.
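For reference, a minimal sketch of the pattern I'm describing (the layer, loss, and learning rate below are placeholders, not my actual model):

```python
import torch
import torch.nn.functional as F
import pytorch_lightning as pl


class LitRegressor(pl.LightningModule):
    """Placeholder module illustrating the return-vs-no-return difference."""

    def __init__(self):
        super().__init__()
        self.layer = torch.nn.Linear(32, 1)

    def training_step(self, batch, batch_idx):
        x, y = batch
        loss = F.mse_loss(self.layer(x), y)
        self.log("train_loss", loss)
        # Returning the loss is what lets the Trainer call backward() and run
        # the optimizer step. If training_step returns None (e.g. the return
        # is commented out), Lightning skips optimization for that batch, so
        # the weights never move and the logged loss looks flat.
        return loss

    def configure_optimizers(self):
        return torch.optim.Adam(self.parameters(), lr=1e-3)  # placeholder lr
```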
Answered by martinez-zacharya on Oct 31, 2022
Replies: 1 comment
-
Never mind, my learning rate was set to 1e5 instead of 1e-5.
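For anyone else who hits this, a small illustration of the typo (the linear layer and Adam optimizer here are just stand-ins for the real module):

```python
import torch

model = torch.nn.Linear(32, 1)  # stand-in for the actual LightningModule parameters

# The bug: 1e5 is 100000.0, not 0.00001, so every optimizer step was ten
# orders of magnitude too large and the loss blew up after the first update.
bad_optimizer = torch.optim.Adam(model.parameters(), lr=1e5)    # what was configured
good_optimizer = torch.optim.Adam(model.parameters(), lr=1e-5)  # what was intended
```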
0 replies
Answer selected by akihironitta