Description
Hi! I am having trouble differentiating a function with ForwardDiff.jl. I don't know why, but the resulting jacobian contains NaNs. Below is an MWE to kick-start the discussion.
Consider the following functions:
using BenchmarkTools, ForwardDiff, LinearAlgebra, StaticArrays
foo2(A) = SVector{4}(norm(r, 2) for r = eachrow(A))
function foo1!(out, x; μ = 0.8 / √2)
    λ = SVector{3}(@view x[1:3])
    K = SMatrix{3,3}(@view x[4:12])
    A_λ = @SMatrix [ 1 0 -1  0 ;
                     0 1  0 -1 ;
                     μ μ  μ  μ ]
    out[1:4] = A_λ' * λ + foo2(A_λ' * K)
    out
end

Let's try out foo1!:
julia> x = rand(12);
julia> out = zeros(4);
julia> foo1!(out, x)
4-element Vector{Float64}:
2.403435298832313
2.0030472808350077
0.7216049166673891
0.6293600802591698

foo1! is type-stable and does not perform dynamic allocations:
julia> @btime $foo1!($out, $x)
21.684 ns (0 allocations: 0 bytes)
4-element Vector{Float64}:
2.403435298832313
2.0030472808350077
0.7216049166673891
0.6293600802591698
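As a side check, and assuming the Test stdlib is loaded, type stability can also be verified directly:

using Test
@inferred foo1!(out, x)  # throws if the inferred return type differs from the actual one
# (or inspect the inferred body with @code_warntype foo1!(out, x) in the REPL)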
We can use ForwardDiff.jl to compute the jacobian of foo1!:

julia> ForwardDiff.jacobian(foo1!, out, x)
4×12 Matrix{Float64}:
1.0 0.0 0.565685 0.606882 0.0 0.343304 … 0.353444 0.491234 0.0 0.277884
0.0 1.0 0.565685 0.0 0.744664 0.421245 0.22503 0.0 0.535939 0.303173
-1.0 0.0 0.565685 0.722007 0.0 -0.408429 -0.362283 0.261825 0.0 -0.14811
0.0 -1.0 0.565685 0.0 0.930926 -0.526611 -0.154069 0.0 0.243306 -0.137634

However, if the inputs are all zeros, the jacobian will contain NaNs:
julia> x = zeros(12);
julia> ForwardDiff.jacobian(foo1!, out, x)
4×12 Matrix{Float64}:
NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN
NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN
NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN
NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN
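For what it's worth, my (unverified) suspicion is that the NaNs come from differentiating the 2-norm at the origin: the derivative of sqrt is infinite at zero, so propagating a zero partial through it yields a 0/0 = NaN in the dual numbers. A minimal sketch that should isolate this, using only foo2 from the MWE above:

# Untested sketch: differentiate just the norm step at a zero input.
# The wrapper rebuilds the SMatrix so foo2 sees the same static type
# it sees inside foo1!.
ForwardDiff.jacobian(A -> foo2(SMatrix{4,3}(A)), zeros(4, 3))
# Expected (unverified): a 4×12 matrix of NaNs.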
But if we change the type of the matrix A_λ (in foo1!) from an SMatrix to a normal Matrix, the jacobian will be evaluated properly:

julia> function foo1!(out, x; μ = 0.8 / √2)
           λ = SVector{3}(@view x[1:3])
           K = SMatrix{3,3}(@view x[4:12])
           A_λ = [ 1 0 -1  0 ;
                   0 1  0 -1 ;
                   μ μ  μ  μ ]
           out[1:4] = A_λ' * λ + foo2(A_λ' * K)
           out
       end
foo1! (generic function with 1 method)
julia> ForwardDiff.jacobian(foo1!, out, x)
4×12 Matrix{Float64}:
1.0 0.0 0.565685 0.0 0.0 0.0 0.0 0.0 0.0 1.0 0.0 0.565685
0.0 1.0 0.565685 0.0 0.0 0.0 0.0 0.0 0.0 0.0 1.0 0.565685
-1.0 0.0 0.565685 0.0 0.0 0.0 0.0 0.0 0.0 -1.0 0.0 0.565685
0.0 -1.0 0.565685 0.0 0.0 0.0 0.0 0.0 0.0 0.0 -1.0 0.565685

Moreover, I have also observed that the jacobian will not contain NaNs if we use the 1-norm or the Inf-norm, even if we keep A_λ as an SMatrix, i.e.,
foo2(A) = SVector{4}(norm(r, 1) for r = eachrow(A))

or
foo2(A) = SVector{4}(norm(r, Inf) for r = eachrow(A))

I am not sure if this is a bug or if I am doing something wrong... Can someone help me figure it out? Thank you in advance!
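In case it is useful, the difference between the norms could probably be isolated with something like this (again an unverified sketch; only the 2-norm goes through a sqrt, whose slope blows up at zero):

# Untested sketch: gradient of each p-norm of a small static vector at the origin.
for p in (1, 2, Inf)
    g = ForwardDiff.gradient(v -> norm(SVector{3}(v), p), zeros(3))
    @show p g  # based on the behaviour above, NaNs are expected only for p = 2
end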