Grad All Zero #811
-
Hi, I wrote a custom integrator, defined a loss function between the reference image and the custom-rendered image, and want to backpropagate the loss to the custom-rendered image. However, the gradients are all zero right now.
The strange part is that if I backpropagate to image1 everything works fine, but it does not work when I backpropagate to image2. Thanks in advance for any help!
Answered by zichenwang01 (Jul 18, 2023)
Answer selected by njroussel
Nvm, I think I figured out the problem: I need to
dr.set_grad(loss, 1)
before backpropagating!
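For reference, here is a minimal sketch of that fix, assuming Mitsuba 3 with Dr.Jit's low-level AD interface (dr.enqueue / dr.traverse) on an *_ad_* variant; the image and ref tensors below are hypothetical placeholders standing in for the custom-rendered and reference images:

```python
import drjit as dr
import mitsuba as mi

mi.set_variant("llvm_ad_rgb")  # assumption: any *_ad_* variant works here

# Hypothetical placeholders for the custom-rendered image and the reference.
image = dr.full(mi.TensorXf, 0.5, (4, 4, 3))
ref   = dr.full(mi.TensorXf, 0.7, (4, 4, 3))

# Track derivatives with respect to the rendered image.
dr.enable_grad(image)
loss = dr.mean(dr.sqr(image - ref))

# The fix: seed the gradient of the loss before traversing the AD graph.
# Without this, a zero gradient is propagated backwards and dr.grad(image)
# stays all zero.
dr.set_grad(loss, 1)
dr.enqueue(dr.ADMode.Backward, loss)
dr.traverse(mi.Float, dr.ADMode.Backward)

print(dr.grad(image))  # now non-zero
```

As far as I can tell, the higher-level dr.backward(loss) wraps these steps (seed, enqueue, traverse) in a single call, so the manual dr.set_grad(loss, 1) is only needed when driving the traversal yourself.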