Learning rate increases where it should decrease #3641
Unanswered
PushpakBhoge512 asked this question in Q&A
Replies: 1 comment
The default warmup length is 1000 iterations (SOLVER.WARMUP_ITERS), so the learning rate is expected to increase for the first 1000 iterations.
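To make the behavior concrete, here is a minimal, self-contained sketch of the schedule WarmupMultiStepLR produces: a linear ramp from `base_lr * warmup_factor` up to `base_lr` during warmup, then a multiplication by `gamma` at each milestone in `steps`. This is a simplified reimplementation for illustration, not detectron2's actual code; the default `warmup_factor=0.001` is assumed from the config defaults.

```python
from bisect import bisect_right

def warmup_multistep_lr(iteration, base_lr=0.001, gamma=0.01,
                        steps=(2000, 4000, 6000, 8000),
                        warmup_iters=1000, warmup_factor=0.001):
    """Simplified sketch of WarmupMultiStepLR-style scheduling.

    For the first `warmup_iters` iterations the LR ramps linearly from
    base_lr * warmup_factor to base_lr; afterwards it is multiplied by
    gamma once for every milestone in `steps` that has been passed.
    """
    if iteration < warmup_iters:
        alpha = iteration / warmup_iters
        factor = warmup_factor * (1 - alpha) + alpha  # linear warmup
    else:
        factor = 1.0
    # bisect_right counts how many milestones have been passed
    return base_lr * factor * gamma ** bisect_right(steps, iteration)

print(warmup_multistep_lr(0))     # tiny LR at the start of warmup
print(warmup_multistep_lr(500))   # mid-warmup: about half of base_lr
print(warmup_multistep_lr(1500))  # warmup done: full base_lr
print(warmup_multistep_lr(2500))  # after first milestone: base_lr * gamma
```

So the increase during the first 1000 iterations is by design; the decreases only start at the first milestone in SOLVER.STEPS.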
Hi all,
I have set the following config for the model Misc/cascade_mask_rcnn_R_50_FPN_3x.yaml (tested with other models as well; the same issue occurs for them too):
cfg.SOLVER.BASE_LR = 0.001
cfg.SOLVER.WEIGHT_DECAY = 0.0001
cfg.SOLVER.GAMMA = 0.01
cfg.SOLVER.LR_SCHEDULER_NAME = "WarmupMultiStepLR"
cfg.SOLVER.MAX_ITER = 10000
cfg.SOLVER.CHECKPOINT_PERIOD = 2000
cfg.SOLVER.STEPS = [2000, 4000, 6000, 8000]
Now, the learning rate should decrease over the iterations, but instead it increases with every iteration.
Why is that? Is there anything I am missing here?