track_grad_norm with wandb #7440
Unanswered
surya-narayanan asked this question in Lightning Trainer API: Trainer, LightningModule, LightningDataModule
Replies: 1 comment 1 reply
-
Hello,

I am using version 1.2.4 of PL. I have track_grad_norm=1 set in my Trainer initialization, but I don't see any of the grad norms in my wandb dashboard (there I see only the things I log with self.log). I was able to see the grad norms in TensorBoard before, and I use a wandb logger created as wandb_logger = WandbLogger(blah).

Is it not supported yet?
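For reference, a minimal sketch of the setup being described, assuming the PL 1.2.x API (the project name below is a placeholder, not from the original post):

```python
# Minimal sketch of the setup described above (PL 1.2.x API).
# "my-project" is a placeholder project name.
import pytorch_lightning as pl
from pytorch_lightning.loggers import WandbLogger

wandb_logger = WandbLogger(project="my-project")

trainer = pl.Trainer(
    track_grad_norm=1,    # track the L1 norm of the gradients
    logger=wandb_logger,  # send logged metrics to wandb
)
```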
-
It seems that self.log(grad_norms) works when we return a loss; I'm not sure why.

1 reply
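One possible workaround, sketched under the assumption that manually logged values behave like any other self.log call: compute the gradient norms yourself in the on_after_backward hook, which runs once gradients exist, and log the total so it is forwarded to whatever logger is attached (wandb here). The module and metric names are illustrative:

```python
import torch
import pytorch_lightning as pl

class MyModule(pl.LightningModule):  # hypothetical module
    def on_after_backward(self):
        # Gradients are populated at this point; sum the L1 norms of all
        # parameters that received a gradient and log the total.
        norms = [p.grad.detach().abs().sum()
                 for p in self.parameters() if p.grad is not None]
        if norms:
            self.log("grad_l1_norm_total", torch.stack(norms).sum())
```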