How is it possible that the map_coordinates function is differentiable? #19149
-
I used the map_coordinates function in the definition of the loss function of a neural network and it worked, hence I assume it must be differentiable (I used the Adam optimizer and the loss value was steadily dropping).

The very first line of its implementation is not differentiable, and even if it were, we are operating here on coordinates into the array (looking for nearby coordinates), which is also not a differentiable operation. So how is it possible that this works? Thanks for the help!
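Roughly the kind of setup in question (a minimal sketch; the array, coordinates, and target below are made up for illustration and are not the original model):

```python
import jax
import jax.numpy as jnp
from jax.scipy.ndimage import map_coordinates

image = jnp.arange(16.0).reshape(4, 4)          # some fixed 2D array to sample from
target = jnp.array([3.0, 7.0])                  # made-up target values

def loss(coords):
    # Sample the image at (possibly fractional) coordinates with linear
    # interpolation and penalize the distance to the target values.
    sampled = map_coordinates(image, coords, order=1)
    return jnp.sum((sampled - target) ** 2)

coords = jnp.array([[0.5, 1.5],    # row coordinates of two sample points
                    [2.3, 0.7]])   # column coordinates of the same points
print(jax.grad(loss)(coords))      # jax.grad differentiates through this without error
```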
-
```python
import jax
import jax.numpy as jnp

print(jax.grad(jnp.floor)(0.5))
# 0.0
```

The sense in which this is differentiable is that if you vary the input infinitesimally, the output does not change, so by the definition of the derivative, the gradient is zero. Things get a bit more complicated when ... Does that answer your question?
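One relevant case (an illustrative aside, not necessarily the point the reply goes on to make): with order=1, map_coordinates interpolates linearly between neighbouring values, so the gradient with respect to a fractional coordinate is the local slope rather than zero:

```python
import jax
import jax.numpy as jnp
from jax.scipy.ndimage import map_coordinates

values = jnp.array([0.0, 1.0, 4.0, 9.0])

def sample(c):
    # Linearly interpolate `values` at the fractional position c.
    return map_coordinates(values, [jnp.reshape(c, (1,))], order=1)[0]

print(sample(1.5))            # 2.5 -- halfway between values[1]=1 and values[2]=4
print(jax.grad(sample)(1.5))  # 3.0 -- the local slope, values[2] - values[1]
```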
The gradient with respect to an indexing operation where the index is a float converted to an int is similar to the gradient with respect to `floor`: it's always zero (because infinitesimally changing the index cannot change the indexed value).
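A small sketch along those lines (the array and index are made up for illustration):

```python
import jax
import jax.numpy as jnp

x = jnp.array([1.0, 2.0, 4.0, 8.0])

def index_by_float(i):
    # Floor the float index, cast it to an int, and index into x with it.
    return x[jnp.floor(i).astype(int)]

print(jax.grad(index_by_float)(1.7))
# 0.0 -- nudging the index infinitesimally never changes which element is selected
```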