How to get different random minibatch orders? #12867
Answered by jaak-s

jaak-s asked this question in Lightning Trainer API: Trainer, LightningModule, LightningDataModule
Hi, I'm training with DDP on 4 GPUs, but I noticed that if I rerun the experiment, the first epoch has the exact same (though random) order of minibatches as the previous experiment. How can I make each run use a different random order? I'm using PL 1.4.5 and PyTorch 1.10.0. Thank you
Answered by jaak-s on Apr 23, 2022
Solved. Use `seed_everything(random_seed)`.
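A minimal sketch of how that could look in a PL 1.4.x training script. The per-run seed choice and the DDP `Trainer` arguments here are illustrative, not part of the original answer:

```python
import random

from pytorch_lightning import Trainer, seed_everything

# Illustrative choice: draw a fresh seed for every run so the
# minibatch order differs between experiments; print it so a given
# run can still be reproduced later by reusing the same seed.
random_seed = random.randint(0, 2**32 - 1)
print(f"Using seed: {random_seed}")

# Seeds Python's random module, NumPy, and PyTorch RNGs;
# workers=True also seeds DataLoader worker processes.
seed_everything(random_seed, workers=True)

# Assumed PL 1.4.x-style DDP setup with 4 GPUs.
trainer = Trainer(gpus=4, accelerator="ddp")
```

Using a different seed per run changes the shuffle because, without one, the shuffle order falls back to a fixed default (PyTorch's `DistributedSampler` defaults to seed 0), so every fresh run otherwise starts from the same permutation.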
Answer selected by rohitgr7