Cannot log gpu memory without making on step = True while logging train loss #7582
Unanswered
karthi0804 asked this question in Lightning Trainer API: Trainer, LightningModule, LightningDataModule
Replies: 0 comments
I am using version 1.3.0. I want to log GPU memory at every training step, so I set `log_gpu_memory='all'` and `log_every_n_steps=1`. However, I couldn't find any per-step GPU memory logging. But if I also log the train loss with `on_step=True`, the GPU memory does show up at each step. Is this how it is intended to work? I would expect GPU memory logging to be independent of `self.log()`.
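For reference, here is a minimal sketch of the setup described above, assuming a Lightning 1.3.x Trainer (where `log_gpu_memory` was still an accepted argument) and a hypothetical `training_step` to illustrate the workaround:

```python
# Sketch of the reported setup, assuming PyTorch Lightning 1.3.x,
# where Trainer accepted the log_gpu_memory flag.
import pytorch_lightning as pl

trainer = pl.Trainer(
    log_gpu_memory='all',   # log memory stats for every visible GPU
    log_every_n_steps=1,    # flush logged metrics on every training step
)

# Observed workaround: per-step GPU memory only appears once some metric
# is logged with on_step=True inside the LightningModule, e.g.:
#
#     def training_step(self, batch, batch_idx):
#         loss = self.compute_loss(batch)          # hypothetical helper
#         self.log('train_loss', loss, on_step=True)
#         return loss
```

Without the `self.log(..., on_step=True)` call, no step-level logging is triggered, which appears to be why the GPU memory metrics never reach the logger.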