Loss not going down - first-time user #15056
Unanswered
alexlewisroberts asked this question in Lightning Trainer API: Trainer, LightningModule, LightningDataModule

Replies: 1 comment
Have you tried printing the loss in your training_step?

```diff
 def training_step(self, batch, batch_idx):
     x, y = batch
     loss = self.loss(self(x), y)
+    print(loss)
     return loss
```

Also, you might want to check our examples here: https://github.com/Lightning-AI/lightning/tree/c39c8eb2e4f49e35ef38a2ae6de2765ec71623db/examples
I validated that the loss decreases without using Lightning. Here is my code:
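For reference, a minimal plain-PyTorch loop of the kind described, which fits a linear model on synthetic data and confirms the loss decreases (everything here is a hypothetical stand-in, not the poster's actual code):

```python
import torch
import torch.nn as nn

torch.manual_seed(0)

# Illustrative setup: linear model, MSE loss, synthetic linear targets.
model = nn.Linear(10, 1)
loss_fn = nn.MSELoss()
opt = torch.optim.SGD(model.parameters(), lr=0.1)

x = torch.randn(64, 10)
true_w = torch.randn(10, 1)
y = x @ true_w

losses = []
for step in range(200):
    opt.zero_grad()
    loss = loss_fn(model(x), y)
    loss.backward()
    opt.step()
    losses.append(loss.item())

# Sanity check: the final loss should be below the initial loss.
assert losses[-1] < losses[0]
```

If a check like this passes outside Lightning but the loss stays flat inside it, the usual suspects are the optimizer returned by `configure_optimizers`, the learning rate, or the batch format handed to `training_step`.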