
Incorrect computation of Hessian of squared Euclidean norm

Open jomi2801 opened this issue 8 years ago • 2 comments

The computation of the Hessian of the squared euclidean norm results in an interesting output:

using ForwardDiff;
using LinearAlgebra;  # `norm` and `dot` live here on Julia >= 0.7
f(x::Vector) = .5*norm(x, 2)^2;
x = zeros(2);
ForwardDiff.hessian(f, x)

which is

 0.0  0.0
 0.0  1.0

whereas the correct Hessian is the 2×2 identity matrix.

However, the following snippets

f(x::Vector) = .5*x[1]^2 + .5*x[2]^2;
ForwardDiff.hessian(f, x)

and

f(x::Vector) = .5*dot(x, x);
ForwardDiff.hessian(f, x)

provide the correct result. Why is that?

jomi2801 avatar Feb 02 '18 17:02 jomi2801

Possibly similar to https://github.com/JuliaDiff/ForwardDiff.jl/issues/197 i.e. Base's linear algebra code might be optimized in an "AD unsafe" way.

jrevels avatar Feb 06 '18 21:02 jrevels

Is there any chance this gets fixed? Unfortunately, in my case replacing `norm(x)^2` by `sum(x.^2)` is not really an option: I have one function that calls `norm(x)` and a second function that later squares the result, and their composition gives NaN:

julia> using ForwardDiff

julia> f(x) = 0.5*sqrt(sum(x.^2))^2
f (generic function with 1 method)

julia> ForwardDiff.hessian(f, [0.0, 0.0])
2×2 Matrix{Float64}:
 NaN  NaN
 NaN  NaN
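The NaN comes from the chain rule: (√u)′ = u′/(2√u), which at u = 0 is the IEEE indeterminate form 0/0. A minimal forward-mode dual-number sketch in Python (an illustration of the mechanism only, not ForwardDiff's actual implementation) reproduces it:

```python
import math

class Dual:
    """Minimal forward-mode dual number: a value plus one derivative (eps)."""
    def __init__(self, val, eps=0.0):
        self.val, self.eps = val, eps
    def __add__(self, other):
        other = other if isinstance(other, Dual) else Dual(other)
        return Dual(self.val + other.val, self.eps + other.eps)
    __radd__ = __add__
    def __mul__(self, other):
        other = other if isinstance(other, Dual) else Dual(other)
        return Dual(self.val * other.val,
                    self.val * other.eps + self.eps * other.val)
    __rmul__ = __mul__

def dual_sqrt(d):
    # Chain rule: (sqrt u)' = u' / (2 sqrt u).  At u == 0 this is 0/0,
    # which IEEE arithmetic turns into NaN.
    v = math.sqrt(d.val)
    return Dual(v, float("nan") if v == 0.0 else d.eps / (2.0 * v))

# Seed x1 with derivative 1 to differentiate with respect to x1 at (0, 0).
x1, x2 = Dual(0.0, 1.0), Dual(0.0, 0.0)

# Through the norm: 0.5 * sqrt(x1^2 + x2^2)^2 -- already the first
# derivative is NaN, so the Hessian is NaN as well.
s = dual_sqrt(x1 * x1 + x2 * x2)
f_norm = 0.5 * (s * s)
print(f_norm.eps)   # nan

# Without the sqrt: 0.5 * (x1^2 + x2^2) -- the derivative is 0.0, as expected.
f_poly = 0.5 * (x1 * x1 + x2 * x2)
print(f_poly.eps)   # 0.0
```

This is why the `0.5*dot(x, x)` formulation above gives the correct Hessian: it never routes the derivative through `sqrt`, so no 0/0 arises at the origin.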

Enabling nansafe_mode also doesn't give the correct result:

julia> ForwardDiff.hessian(f, [0.0, 0.0])
2×2 Matrix{Float64}:
 0.0  0.0
 0.0  0.0
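For what it's worth, the all-zero Hessian is consistent with how NaN-safe mode works (roughly: a zero partial annihilates even an Inf/NaN derivative factor, i.e. 0 × ∞ := 0). A tiny sketch of that rule in Python (an assumption-laden illustration, not ForwardDiff's code):

```python
import math

def nansafe_mul(a, b):
    # NaN-safe multiplication rule (roughly): a zero partial annihilates
    # the other factor, even when that factor is Inf or NaN.
    return 0.0 if a == 0.0 else a * b

# Normally the partial through sqrt at 0 is 0 * inf, i.e. NaN:
print(0.0 * math.inf)              # nan

# NaN-safe mode defines it to be 0 instead...
print(nansafe_mul(0.0, math.inf))  # 0.0

# ...which suppresses the spurious NaN but also wipes out the genuine
# curvature, so the Hessian entries come out 0.0 instead of 1.0.
```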

So this would require rewriting my code base, which I would really like to avoid.

JoshuaLampert avatar Apr 19 '24 19:04 JoshuaLampert