
Trainer: loss stagnates, whereas custom train implementation continues converging? #12667


Update: change

loss = self.criterion(y, y_hat)

to

loss = self.criterion(y_hat, y)

everywhere. PyTorch loss functions take (input, target): the model's predictions first, then the labels. With the arguments swapped, an asymmetric loss is computed against the wrong tensor, so training appears to stagnate.
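A minimal sketch of the argument order (the `CrossEntropyLoss` choice and tensor shapes are illustrative, not taken from the thread):

```python
import torch
import torch.nn as nn

criterion = nn.CrossEntropyLoss()

y_hat = torch.randn(4, 10)       # predictions: logits for 4 samples, 10 classes
y = torch.tensor([1, 0, 3, 9])   # targets: class indices

# Correct order: (input, target) -- predictions first, labels second.
loss = criterion(y_hat, y)
print(loss.item())

# criterion(y, y_hat) would either raise a shape/dtype error or, for losses
# that accept float targets, silently compute the wrong quantity.
```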

Also:

trainer.fit(lightningmodule, train_dataloaders=datamodule.train_dataloader(), val_dataloaders=datamodule.val_dataloader())

can be simplified to

trainer.fit(lightningmodule, datamodule=datamodule)

so the Trainer calls the datamodule's hooks (setup, train_dataloader, val_dataloader) itself at the appropriate time.

Answer selected by stillsen