Help with LARS for Barlow Twins #11535
Unanswered
dmandair asked this question in Lightning Trainer API: Trainer, LightningModule, LightningDataModule
Replies: 1 comment 1 reply
-
If you are using the LARS wrapper, you need to wrap an optimizer within it. However, on the current master branch we removed the LARS wrapper and added LARS, which you can use directly as a regular optimizer. You'll have to add the warm-up schedule separately from the optimizer.
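A minimal sketch of what this can look like in `configure_optimizers`, assuming the `pl_bolts.optimizers.lars.LARS` import path from current bolts master and the Barlow Twins-style hyperparameter values shown here (none of these values come from this thread); the warm-up is attached as an ordinary PyTorch LR scheduler:

```python
import torch
from pytorch_lightning import LightningModule
from pl_bolts.optimizers.lars import LARS  # assumed import path on current bolts master


class BarlowTwins(LightningModule):
    # ... model / training_step omitted ...

    def configure_optimizers(self):
        # LARS here is a standard torch Optimizer, so no wrapper is needed.
        optimizer = LARS(
            self.parameters(),
            lr=0.2,              # illustrative base LR; scale for your batch size
            momentum=0.9,
            weight_decay=1.5e-6,
        )

        # Warm-up is not built into LARS, so schedule it separately:
        # linear warm-up over the first `warmup_epochs`, constant afterwards
        # (swap in a cosine decay if you want the full Barlow Twins recipe).
        warmup_epochs = 10

        def warmup_fn(epoch):
            return min(1.0, (epoch + 1) / warmup_epochs)

        scheduler = torch.optim.lr_scheduler.LambdaLR(optimizer, lr_lambda=warmup_fn)
        return [optimizer], [scheduler]
```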
-
Hi, and thanks in advance for the help! This is more of a general question, but I'm unsure how to use this LARS wrapper (https://pytorch-lightning-bolts.readthedocs.io/en/0.2.1/api/pl_bolts.optimizers.lars_scheduling.html) with the configure_optimizers function in PyTorch Lightning. Would I just wrap an Adam optimizer with it, and would that be the sole item returned by the function? Additionally, in Barlow Twins the learning rate had a warm-up schedule; would this need to be implemented separately, or is it taken care of in the implementation of the LARS wrapper? Thanks again for the help!
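For reference, a minimal sketch of the wrapper approach the question describes, assuming the bolts 0.2.1 `LARSWrapper` from the linked docs (its exact constructor arguments and compatibility with newer Lightning versions are not confirmed in this thread):

```python
import torch
from pl_bolts.optimizers.lars_scheduling import LARSWrapper  # path from the linked 0.2.1 docs


# inside your LightningModule:
def configure_optimizers(self):
    # Wrap a regular optimizer; LARSWrapper adds layer-wise adaptive LR scaling on top.
    base_optimizer = torch.optim.Adam(self.parameters(), lr=1e-3)  # lr chosen for illustration
    optimizer = LARSWrapper(base_optimizer)
    # Note: the wrapper does NOT add the Barlow Twins warm-up; that still has to be
    # scheduled separately (see the LARS + LambdaLR sketch in the reply above).
    return optimizer
```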