global_step while using two dataloaders #13993
Unanswered
fugokidi asked this question in Lightning Trainer API: Trainer, LightningModule, LightningDataModule
Replies: 1 comment 1 reply
-
Sorry, I found out that it only happens in the hydra-lightning-template. In normal PyTorch Lightning, …
-
While using two data loaders as in
The `global_step` from
`on_train_batch_end(trainer, pl_module, outputs, batch, batch_idx, unused=0)`
in a callback increases by 2. For example, `trainer.global_step` goes 2, 4, 6, 8, ... and I'm afraid it will interfere with tracking. Is it intended that it increases by 2 while using two data loaders at the same time?
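A minimal plain-Python sketch (no Lightning dependency) of the pattern described above. `FakeTrainer` and `run_epoch` are hypothetical stand-ins, not Lightning APIs: they assume the doubling comes from the step counter being bumped once per dataloader within a single training batch, so a hook that fires at batch end observes 2, 4, 6, 8, ...

```python
class FakeTrainer:
    """Hypothetical stand-in for a trainer; only tracks global_step."""

    def __init__(self):
        self.global_step = 0


def run_epoch(trainer, num_batches, steps_per_batch):
    """Simulate one epoch; return the global_step values observed at each batch end."""
    observed = []
    for _ in range(num_batches):
        # Assumption: one counter increment per dataloader in the batch.
        for _ in range(steps_per_batch):
            trainer.global_step += 1
        # What an on_train_batch_end-style hook would see at this point.
        observed.append(trainer.global_step)
    return observed


# With two dataloaders the observed counter skips every other value.
print(run_epoch(FakeTrainer(), num_batches=4, steps_per_batch=2))  # -> [2, 4, 6, 8]
```

With `steps_per_batch=1` the same loop yields the expected 1, 2, 3, ..., which is why a callback that assumes consecutive step values can be thrown off by the two-dataloader case.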