How are lr_scheduler and EMA used in LightningLite? #12007
Unanswered
Breeze-Zero asked this question in Lightning Trainer API: Trainer, LightningModule, LightningDataModule

Hi, I want to use lr_scheduler and EMA in LightningLite, but I didn't see how to use them in the tutorial. I have to use multiple GPUs, so I'm not sure how this part works in Lite.

Replies: 1 comment
hey @834799106! It should be as simple as it is with plain PyTorch loops:

```python
import torch
from torch.utils.data import DataLoader
from pytorch_lightning.lite import LightningLite
from torch_ema import ExponentialMovingAverage  # e.g. the torch-ema package


class Lite(LightningLite):
    def run(self, args):
        model = MyModel(...)
        optimizer = torch.optim.SGD(model.parameters(), ...)
        # let Lite wrap the model and optimizer for the chosen accelerator/strategy
        model, optimizer = self.setup(model, optimizer)
        lr_scheduler = LRScheduler(optimizer, ...)  # any torch.optim.lr_scheduler works here
        ema = ExponentialMovingAverage(model.parameters(), ...)
        dataloader = DataLoader(MyDataset(...), ...)
        # adds the distributed sampler and handles device transfer
        dataloader = self.setup_dataloaders(dataloader)

        model.train()
        for epoch in range(args.num_epochs):
            for batch in dataloader:
                optimizer.zero_grad()
                loss = model(batch)
                self.backward(loss)  # use this instead of loss.backward()
                optimizer.step()
                lr_scheduler.step()
                ema.update()


Lite(...).run(args)
```
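Since the question mentions multiple GPUs: the loop itself does not change, because `self.setup(...)` and `self.setup_dataloaders(...)` wrap the model, optimizer, and dataloader for whatever strategy Lite is constructed with. A minimal launch sketch, assuming the `accelerator`/`devices`/`strategy` arguments of the `LightningLite` constructor (the concrete values here are illustrative):

```python
# Same run() as above, now on 2 GPUs with DDP; only the constructor call changes.
# The device count and strategy name are illustrative assumptions.
Lite(accelerator="gpu", devices=2, strategy="ddp").run(args)
```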
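If the EMA weights should also be used for validation, and assuming `ema` is a `torch_ema.ExponentialMovingAverage` (the reply does not say which EMA implementation is meant), a sketch of an eval pass with the averaged parameters:

```python
# Sketch only: `val_dataloader` is a hypothetical loader prepared with
# self.setup_dataloaders(). average_parameters() temporarily copies the EMA
# weights into the model and restores the training weights on exit.
model.eval()
with ema.average_parameters(), torch.no_grad():
    for batch in val_dataloader:
        val_loss = model(batch)
model.train()
```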