```
/home/jhyecheol/.local/lib/python3.8/site-packages/torch/optim/lr_scheduler.py:118: UserWarning: Detected call of `lr_scheduler.step()` before `optimizer.step()`. In PyTorch 1.1.0 and later, you should call them in the opposite order: `optimizer.step()` before `lr_scheduler.step()`. Failure to do this will result in PyTorch skipping the first value of the learning rate schedule. See more details at https://pytorch.org/docs/stable/optim.html#how-to-adjust-learning-rate
  warnings.warn("Detected call of `lr_scheduler.step()` before `optimizer.step()`. "
```
I believe this warning is caused by the change in PyTorch 1.1.0 to the order in which the learning rate is updated. Since it seemed better to let you know, I opened an issue.
This could easily be fixed by adding one conditional variable. However, before I try to work out a fix, I want to confirm whether all training runs start at epoch 1. If not, I'd like to know where the starting epoch is defined.
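For reference, a minimal sketch of the call order the warning asks for (this uses a placeholder model and optimizer, not this repository's actual training code; `start_epoch` is an assumed variable illustrating the resume question above):

```python
import torch

# Placeholder model/optimizer/scheduler for illustration only.
model = torch.nn.Linear(4, 2)
optimizer = torch.optim.SGD(model.parameters(), lr=0.1)
scheduler = torch.optim.lr_scheduler.StepLR(optimizer, step_size=1, gamma=0.5)

start_epoch = 0  # assumed; the question above is whether this is always 0/1 here
for epoch in range(start_epoch, 3):
    optimizer.zero_grad()
    loss = model(torch.randn(8, 4)).sum()
    loss.backward()
    optimizer.step()   # update the weights first (PyTorch >= 1.1.0 order)...
    scheduler.step()   # ...then advance the learning-rate schedule
```

With this order the first epoch trains at the initial learning rate (0.1), and no UserWarning is emitted.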