Precision defaults on trainer constructor #8966
Unanswered
ananthsub asked this question in Lightning Trainer API: Trainer, LightningModule, LightningDataModule
Replies: 0 comments
I'm going through the Trainer constructor as part of the API review #7740.

Should `amp_backend` on the Trainer constructor have a default value of `'native'`? By default it's a no-op, because the default value for `precision` is 32. Should this be optional instead?

Should `amp_level` have a default value of `'O2'`? This can be confusing alongside the default for `amp_backend`, because PyTorch native AMP doesn't expose an optimization-level setting; it is used only for apex AMP. Should this be optional instead?

https://github.com/PyTorchLightning/pytorch-lightning/blob/5329b0d11358b72a42d596d3b70b7010e35c45af/pytorch_lightning/trainer/trainer.py#L140-L157
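To make the proposal concrete, here is a minimal sketch of how `Optional` defaults for these arguments might behave. This is a hypothetical simplification, not the actual Lightning `Trainer` implementation: AMP settings are only resolved when `precision=16` is requested, and `amp_level` is rejected for the native backend since it applies only to apex.

```python
from typing import Optional


class Trainer:
    """Illustrative sketch of the proposed Optional AMP defaults."""

    def __init__(
        self,
        precision: int = 32,
        amp_backend: Optional[str] = None,  # proposed: None instead of 'native'
        amp_level: Optional[str] = None,    # proposed: None instead of 'O2'
    ) -> None:
        if precision == 16:
            # Only resolve AMP settings when mixed precision is requested;
            # with precision=32 they stay None instead of being misleading no-ops.
            amp_backend = amp_backend or "native"
            if amp_backend == "native" and amp_level is not None:
                raise ValueError(
                    "amp_level applies only to the 'apex' backend; "
                    "native PyTorch AMP has no optimization levels."
                )
            if amp_backend == "apex":
                amp_level = amp_level or "O2"
        self.precision = precision
        self.amp_backend = amp_backend
        self.amp_level = amp_level
```

With this shape, `Trainer()` leaves both AMP fields as `None`, `Trainer(precision=16)` resolves the backend to `'native'` with no level, and `Trainer(precision=16, amp_backend='apex')` fills in `'O2'` only where apex actually uses it.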