hey @miccio-dk I think you can use |
Hi, everyone!
I have a use case where I'd like to train a model on a continuous flow of randomly generated data and perform validation every e.g. 1000 steps instead of every epoch. Thus, everything would run within one epoch, and the length of the training would be specified in steps.
Is this doable with pytorch-lightning?
One way I can think of would be to set the dataset's `__len__()` to the number of steps between validation cycles and then pick the data randomly (disregarding the index argument in `__getitem__()`).
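A minimal sketch of that idea, without the PyTorch-specific pieces (the class and parameter names here are illustrative, not from the original post): a map-style dataset whose reported length equals the validation interval, and whose item access ignores the index and returns freshly generated random data.

```python
import random

class RandomStreamDataset:
    """Map-style dataset for a continuous random data stream.

    Its length is set to the number of training steps between
    validation cycles, so one "epoch" corresponds to one cycle.
    """

    def __init__(self, steps_per_validation, num_features=4):
        self.steps_per_validation = steps_per_validation
        self.num_features = num_features

    def __len__(self):
        # One pass over this dataset = one validation interval.
        return self.steps_per_validation

    def __getitem__(self, index):
        # The index is disregarded; every access yields new random data.
        return [random.random() for _ in range(self.num_features)]

dataset = RandomStreamDataset(steps_per_validation=1000)
print(len(dataset))     # 1000 items per "epoch"
print(len(dataset[0]))  # 4 features, regardless of index
```

Note that PyTorch Lightning's `Trainer` also exposes `val_check_interval` and `max_steps` arguments, which address step-based validation and step-based training length more directly than resizing the dataset.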