Question about AMP training. #13242
Unanswered
exiawsh asked this question in Lightning Trainer API: Trainer, LightningModule, LightningDataModule
Replies: 0 comments
Hello,
I have a question about AMP training. I would like to train only part of my model in fp16: specifically, train the backbone in fp16 and the head in fp32. How can I achieve this with PyTorch Lightning?
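One possible approach (not an official Lightning recipe, just a sketch) is to let AMP run as usual and locally disable autocast around the head, so the backbone executes in reduced precision while the head computes in full fp32. The module names `backbone` and `head` below are placeholders; in Lightning you would put this pattern inside your `LightningModule`'s `forward` or `training_step` and enable AMP via the `Trainer`'s `precision` argument.

```python
import torch
import torch.nn as nn

# Toy stand-ins for a real backbone and head (assumed names, for illustration).
backbone = nn.Linear(8, 8)
head = nn.Linear(8, 2)

x = torch.randn(4, 8)

# fp16 autocast is the usual choice on CUDA; CPU autocast uses bfloat16.
device_type = "cuda" if torch.cuda.is_available() else "cpu"
amp_dtype = torch.float16 if device_type == "cuda" else torch.bfloat16

with torch.autocast(device_type=device_type, dtype=amp_dtype):
    feats = backbone(x)  # backbone runs in reduced precision under autocast
    # Disable autocast for the head and cast its input back to fp32,
    # so everything downstream stays in full precision.
    with torch.autocast(device_type=device_type, enabled=False):
        out = head(feats.float())

print(feats.dtype)  # reduced precision (fp16 on CUDA, bf16 on CPU)
print(out.dtype)    # torch.float32
```

With a Lightning `Trainer` configured for mixed precision, the trainer wraps `training_step` in autocast for you, so only the inner `enabled=False` block and the `.float()` cast are needed inside the model code. Note that the optimizer states and master weights remain fp32 under standard AMP either way; this pattern only controls the precision of the forward computation.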