'RuntimeError: element 0 of tensors does not require grad and does not have a grad_fn' when using optimizers at arbitrary intervals with automatic optimization. #8023
Unanswered · ethanherron asked this question in code help: CV
-
Hello, I am getting the following error when trying to override optimizer_step() to execute optimization steps at arbitrary intervals:
RuntimeError: element 0 of tensors does not require grad and does not have a grad_fn
The code runs fine without overriding optimizer_step(), and when I do override it I have made sure each optimizer used is appropriately accounted for. For reference, I have included the training_step(), configure_optimizers(), and optimizer_step() code for the first two optimizers below; nonessential code has been replaced with comments for clarity in this discussion post.
Any help or ideas are greatly appreciated.
Thanks!
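For context, here is a minimal sketch of this kind of setup — hypothetical module and optimizer names, not the code from this post — assuming the optimizer_step() signature from Lightning ~1.3, with optimizer 0 stepping every batch and optimizer 1 stepping every other batch. It also assumes the dataloader yields plain float tensors of shape (N, 32).

```python
import torch
import pytorch_lightning as pl


class IntervalOptimizationModel(pl.LightningModule):
    """Two optimizers under automatic optimization; optimizer 1 steps every 2 batches."""

    def __init__(self):
        super().__init__()
        # Hypothetical sub-networks standing in for the real model components.
        self.net_a = torch.nn.Linear(32, 32)
        self.net_b = torch.nn.Linear(32, 32)

    def training_step(self, batch, batch_idx, optimizer_idx):
        # With multiple optimizers, training_step runs once per optimizer.
        # Each branch must return a loss connected to the parameters that the
        # corresponding optimizer updates, otherwise backward() fails with
        # "element 0 of tensors does not require grad ...".
        if optimizer_idx == 0:
            return self.net_a(batch).pow(2).mean()
        if optimizer_idx == 1:
            return self.net_b(batch).pow(2).mean()

    def configure_optimizers(self):
        opt_a = torch.optim.Adam(self.net_a.parameters(), lr=1e-4)
        opt_b = torch.optim.Adam(self.net_b.parameters(), lr=1e-4)
        return opt_a, opt_b

    def optimizer_step(self, epoch, batch_idx, optimizer, optimizer_idx,
                       optimizer_closure, on_tpu=False, using_native_amp=False,
                       using_lbfgs=False):
        # Optimizer 0 steps on every batch.
        if optimizer_idx == 0:
            optimizer.step(closure=optimizer_closure)
        # Optimizer 1 steps only every second batch. The closure is still run
        # when the step is skipped so that training_step/backward for this
        # optimizer are not silently dropped.
        if optimizer_idx == 1:
            if (batch_idx + 1) % 2 == 0:
                optimizer.step(closure=optimizer_closure)
            else:
                optimizer_closure()
```

In a setup like this, the closure passed to optimizer_step() is what runs training_step() and backward(), so each branch of training_step() has to return a loss that is attached to the graph of the parameters its optimizer updates.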
-
Hi @ethanherron, I tried to reproduce your problem with our BoringModel, but it all worked fine. Usually this error happens when the loss returned by training_step() doesn't have a gradient. Are you freezing parts of your model, or detaching the loss, between steps? Sorry I couldn't be of more help. If you're able to reproduce the problem using our BoringModel, we would likely be able to assist you further. Thanks 😃
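A minimal illustration of that failure mode in plain PyTorch, assuming (as a guess) that the loss ends up with no grad_fn because every parameter it depends on has been frozen or the loss has been detached:

```python
import torch

model = torch.nn.Linear(4, 1)

# Freezing every parameter (or detaching the loss) leaves the loss with
# requires_grad=False and no grad_fn, which is what triggers this error.
for p in model.parameters():
    p.requires_grad = False

loss = model(torch.randn(2, 4)).mean()
print(loss.requires_grad, loss.grad_fn)  # False None

try:
    loss.backward()
except RuntimeError as err:
    # element 0 of tensors does not require grad and does not have a grad_fn
    print(err)
```

Inside Lightning, the same thing happens when the closure passed to optimizer_step() runs backward() on such a loss.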