Setting 16-bit Precision causes OOM #6860
Unanswered
aleSuglia asked this question in Lightning Trainer API: Trainer, LightningModule, LightningDataModule
Replies: 0 comments
Hi there,
I'm using PL to train a BERT-based model, but enabling 16-bit precision in my trainer doesn't seem to reduce the memory footprint or improve speed. In fact, the 32-bit precision trainer runs without going OOM, but when I switch to 16-bit I hit an OOM error. Could you please advise on how to benefit from this feature?
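For reference, a minimal self-contained sketch of the kind of setup being described, assuming the Lightning 1.x API that was current when this was asked (integer `precision=16` and the `gpus` flag; newer releases use `accelerator="gpu"`/`devices` and `precision="16-mixed"`). The module and data here are toy placeholders, not the actual BERT model:

```python
import torch
from torch.utils.data import DataLoader, TensorDataset
import pytorch_lightning as pl


class TinyModule(pl.LightningModule):
    """Hypothetical stand-in for the BERT-based model in the question."""

    def __init__(self):
        super().__init__()
        self.layer = torch.nn.Linear(32, 2)

    def training_step(self, batch, batch_idx):
        x, y = batch
        return torch.nn.functional.cross_entropy(self.layer(x), y)

    def configure_optimizers(self):
        return torch.optim.Adam(self.parameters(), lr=1e-3)


# Dummy dataset so the sketch runs end to end.
train_loader = DataLoader(
    TensorDataset(torch.randn(64, 32), torch.randint(0, 2, (64,))),
    batch_size=8,
)

# precision=16 enables native automatic mixed precision (AMP) on GPU.
# AMP keeps FP32 master weights and casts to FP16 per-op, so peak memory
# is not simply halved and can in some configurations exceed the FP32 run.
trainer = pl.Trainer(gpus=1, precision=16, max_epochs=1)
trainer.fit(TinyModule(), train_loader)
```

One possible explanation for the behavior described: because native AMP retains FP32 master weights alongside the FP16 compute copies, a model that was already near the memory limit at 32-bit can tip into OOM when mixed precision is enabled.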