Can we overfit batches if labels are incorrect? #12027
Unanswered
talhaanwarch
asked this question in
Lightning Trainer API: Trainer, LightningModule, LightningDataModule
Replies: 1 comment
-
Up to you, but you shouldn't use incorrect labels with `overfit_batches`, since the idea here is to check whether the model/data pipeline is good enough for further training or not. If you pass in incorrect labels, the results won't signify what we are trying to infer.
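The point above can be illustrated with a minimal sketch (plain Python, not the Lightning API): any model with enough capacity to memorize a single batch will fit shuffled labels just as easily as correct ones, so reaching 100% under `overfit_batches` tells you the pipeline can fit the data, not that the labels are right.

```python
# Illustrative sketch: overfitting one batch cannot detect label corruption.
# A memorizing "model" (a dict keyed by the input) reaches 100% training
# accuracy even on randomly shuffled labels.
import random

random.seed(0)
features = [(i, i * 2) for i in range(8)]   # one small "batch"
true_labels = [x % 2 for x, _ in features]
shuffled = true_labels[:]
random.shuffle(shuffled)                    # corrupt the labels

# "Training" = memorize the (feature -> label) mapping exactly,
# which is what overfitting a single batch amounts to.
model = dict(zip(features, shuffled))

accuracy = sum(model[f] == y for f, y in zip(features, shuffled)) / len(features)
print(accuracy)  # 1.0: a perfect fit despite corrupted labels
```

So a 100% result here says nothing about whether the labels match the features; `overfit_batches` is a sanity check for the model and data pipeline, not for label quality.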
-
I want to know: can we use `overfit_batches` to check the sanity of my label data? For example, if I randomly define the labels, or if I shuffle the labels without shuffling the corresponding features, can we still use `overfit_batches` and get 100% results?