Lightning module property self.precision is not consistent #7419
Unanswered
karthi0804 asked this question in Lightning Trainer API: Trainer, LightningModule, LightningDataModule
Replies: 1 comment · 5 replies
-
@justusschock @awaelchli this looks like a real issue. From the mixed precision plugin: https://github.com/PyTorchLightning/pytorch-lightning/blob/28103c67c2baa5810d8e76f77176361e3f61076b/pytorch_lightning/plugins/precision/mixed.py#L27 — why didn't mypy flag this? @shuyingsunshine21 this might lead to underperformance in fp16 for sharded DDP because of this check: https://github.com/PyTorchLightning/pytorch-lightning/blob/28103c67c2baa5810d8e76f77176361e3f61076b/pytorch_lightning/plugins/training_type/sharded.py#L57
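To illustrate the two points above, here is a minimal, self-contained sketch (simplified names, not the actual pytorch-lightning source): a string value assigned to a class attribute that the base class annotates as `int` is exactly the kind of override mypy would normally flag, and a downstream `precision == 16` check (like the one linked in sharded.py) silently never matches when the value is the string `"mixed"`.

```python
# Illustrative sketch only; class and function names are simplified
# stand-ins for the pattern described above, not the library's code.


class PrecisionPlugin:
    # The base class declares precision as an int...
    precision: int = 32


class MixedPrecisionPlugin(PrecisionPlugin):
    # ...but the mixed-precision subclass overrides it with a string,
    # which mypy should report as an incompatible assignment.
    precision = "mixed"


def uses_fp16_reduce_buffers(plugin: PrecisionPlugin) -> bool:
    # A check like this never matches when precision is "mixed",
    # so fp16-specific optimizations would silently be skipped.
    return plugin.precision == 16


if __name__ == "__main__":
    plugin = MixedPrecisionPlugin()
    print(plugin.precision)               # -> mixed
    print(uses_fp16_reduce_buffers(plugin))  # -> False, even with AMP enabled
```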
-
I am using PyTorch Lightning version 1.3.0. I found that `self.precision` of the `pl_module` returns `32` when precision is set to 32 in the Trainer, but returns `'mixed'` when precision is set to 16. Curious to know why it isn't `16`?
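A minimal reproduction sketch of the observed behavior, assuming PyTorch Lightning 1.3.x and a CUDA GPU (since `precision=16` uses native AMP); the module and dataset names below are hypothetical, only `self.precision` and the Trainer arguments come from the report:

```python
# Minimal repro sketch: prints self.precision inside training_step.
import torch
from torch.utils.data import DataLoader, Dataset
import pytorch_lightning as pl


class RandomDataset(Dataset):
    def __len__(self):
        return 64

    def __getitem__(self, idx):
        return torch.randn(32)


class PrecisionProbe(pl.LightningModule):
    def __init__(self):
        super().__init__()
        self.layer = torch.nn.Linear(32, 2)

    def training_step(self, batch, batch_idx):
        # With precision=32 this prints 32; with precision=16 it
        # prints 'mixed' instead of 16, which is the reported behavior.
        print("self.precision =", self.precision)
        return self.layer(batch).sum()

    def configure_optimizers(self):
        return torch.optim.SGD(self.parameters(), lr=0.1)


if __name__ == "__main__":
    trainer = pl.Trainer(
        fast_dev_run=True,
        gpus=1,
        precision=16,  # change to 32 to see self.precision report 32
    )
    trainer.fit(PrecisionProbe(), DataLoader(RandomDataset(), batch_size=8))
```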