AD Update Parameter #816
Answered by njroussel
zichenwang01 asked this question in Q&A
Hi, I want to forward a single parameter to the rendered image and optimize it. However, right now, once I update the parameter, the gradients all become zero. Am I updating the parameter incorrectly?
Answered by njroussel on Jul 28, 2023
Hi @zichenwang01

I think this is just basic Python at play. After

`p = p - 1e-2 * dr.grad(loss)`

the `p` variable in the loop body no longer matches the `p` held by the integrator `sdf_integrator`. You're then calling `backward_to(p)` on the `p` defined in the loop body, not the one in the integrator, so it is completely detached from the loss computation (this will obviously give a zero gradient).

This is hard to write out clearly, I hope it makes sense; the short pure-Python stand-in below shows the same mechanism without any rendering involved.
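A minimal sketch of the name-rebinding issue in plain Python; `SDFIntegrator` and the numbers are made up for illustration:

```python
# Hypothetical stand-in for the real integrator (names mirror the thread).
class SDFIntegrator:
    def __init__(self, p):
        self.p = p

sdf_integrator = SDFIntegrator(p=1.0)
p = sdf_integrator.p

# This rebinds the *local* name `p` to a brand-new object ...
p = p - 1e-2 * 0.5       # stand-in for `p - 1e-2 * dr.grad(loss)`

print(p)                 # 0.995
print(sdf_integrator.p)  # 1.0 (the integrator never saw the update)

# Writing the value back is what re-synchronizes the two.
sdf_integrator.p = p
print(sdf_integrator.p)  # 0.995
```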
Your issue should be solved by doing something like

`sdf_integrator.p = p`

Without a full minimal reproducer I can't tell for sure.
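For completeness, here is a sketch of how the full optimization loop could look with that fix in place. It assumes a `scene`, a reference image `img_ref`, and your custom `sdf_integrator` with a differentiable attribute `p`; none of these were shown in the question, so all names are assumptions:

```python
import drjit as dr
import mitsuba as mi

mi.set_variant('llvm_ad_rgb')  # assumption: any *_ad_* variant works

# Assumed to exist: `scene`, `img_ref`, and `sdf_integrator` whose
# attribute `p` is a differentiable Dr.Jit type read inside sample().

for it in range(50):
    # Track derivatives for the parameter the integrator actually reads.
    dr.enable_grad(sdf_integrator.p)

    img = mi.render(scene, integrator=sdf_integrator, seed=it)
    loss = dr.mean((img - img_ref) ** 2)

    # Propagate derivatives from the loss back to `sdf_integrator.p`.
    dr.backward(loss)

    # Gradient step: this only rebinds the local name `p`;
    # `sdf_integrator.p` still refers to the old variable ...
    p = sdf_integrator.p - 1e-2 * dr.grad(sdf_integrator.p)

    # ... so write the (detached) update back into the integrator, as
    # suggested above, before the next iteration.
    sdf_integrator.p = dr.detach(p)
```

Detaching the updated value before storing it keeps one iteration's AD graph from leaking into the next, so each `dr.backward(loss)` only differentiates the current render.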