ReverseDiff.jl
Reverse Mode Automatic Differentiation for Julia
I tried to compute the gradient of a function with a quadratic form. However, it failed with an ambiguity error as follows: ```julia import ReverseDiff const A = [1.0 2.0; 2.0 5.0]...
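One workaround that is often suggested for this class of ambiguity is to avoid the adjoint-times-TrackedArray method entirely by writing the quadratic form with `dot`. A minimal sketch, assuming the function is `x' * A * x` with the `A` from the snippet:

```julia
using LinearAlgebra
import ReverseDiff

const A = [1.0 2.0; 2.0 5.0]

# `dot(x, A * x)` computes the same quadratic form as `x' * A * x`,
# but routes through methods that dispatch cleanly on TrackedArray.
f(x) = dot(x, A * x)

g = ReverseDiff.gradient(f, [1.0, 2.0])
# For symmetric A the gradient is (A + A') * x = 2A * x,
# i.e. [10.0, 24.0] at x = [1.0, 2.0].
```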
When using ReverseDiff over Zygote, its tracked arrays will often be turned into arrays of TrackedReal: ```julia julia> using ReverseDiff, Zygote julia> _, back = pullback(x -> cumsum(x .^ 3),...
I was using ReverseDiff in a function that uses normcdf and observed the following error: > MethodError: promote_rule(::Type{Irrational{:invsqrt2}}, ::Type{ReverseDiff.TrackedReal{Float64,Float64,ReverseDiff.TrackedArray{Float64,Float64,1,Array{Float64,1},Array{Float64,1}}}}) is ambiguous. Candidates: > promote_rule(::Type{R}, ::Type{ReverseDiff.TrackedReal{V,D,O}}) where {R promote_rule(::Type{#s268} where #s268...
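The ambiguity arises when the `Irrational` constant `invsqrt2` is promoted against a `TrackedReal`. A hedged workaround sketch (the function name `my_normcdf` is illustrative, not part of any package): materialize the constant as a `Float64` before it meets the tracked value.

```julia
import ReverseDiff
using SpecialFunctions: erfc

# Standard normal CDF: Φ(x) = erfc(-x / √2) / 2, with 1/√2 written
# as a Float64 literal so no Irrational enters the promotion machinery.
my_normcdf(x) = erfc(-x * 0.7071067811865476) / 2

g = ReverseDiff.gradient(x -> my_normcdf(x[1]), [0.0])
# d/dx Φ(x) at 0 is the standard normal pdf, 1/√(2π) ≈ 0.3989
```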
I'm trying to calculate gradients with respect to elements of a sparse input array, but this seems to be forbidden due to the check: ```AssertionError: IndexStyle(value) === IndexLinear()``` Could you...
Does ReverseDiff.jl have something similar to `y,pb=Zygote.pullback(f,x)` which builds a pullback function `pb` that can be called on an output tangent `Δy` and when called internally does the vector-jacobian...
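There is no direct `pullback` counterpart in ReverseDiff's documented API, but a vector-Jacobian product can be expressed through `ReverseDiff.gradient`, since `∇ₓ ⟨Δy, f(x)⟩ = Δyᵀ J`. A sketch under that assumption:

```julia
using LinearAlgebra
import ReverseDiff

f(x) = x .^ 2                 # example function with vector output
x  = [1.0, 2.0, 3.0]
Δy = [1.0, 0.0, 1.0]          # output tangent to pull back

# VJP as the gradient of the scalar ⟨Δy, f(x)⟩ with respect to x.
vjp = ReverseDiff.gradient(z -> dot(Δy, f(z)), x)
# Here J = Diagonal(2x), so Δyᵀ J = [2.0, 0.0, 6.0].
```

Note that, unlike Zygote's `pullback`, this re-records a tape per tangent; for repeated VJPs at the same input, a prerecorded `ReverseDiff.GradientTape` could amortize that cost.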
Right now, if I have a function `f(a, b, c)` and I only want to create a function which returns the gradient w.r.t. `a` and `b`, I have two...
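One common pattern for this (a sketch; the particular `f` below is only illustrative) is to close over the arguments you do not want differentiated and pass a tuple of the rest, since `ReverseDiff.gradient` accepts a tuple of inputs and returns a tuple of gradients:

```julia
import ReverseDiff

f(a, b, c) = sum(a .* b) + sum(c)

a, b, c = [1.0, 2.0], [3.0, 4.0], [5.0]

# c is captured by the closure, so no gradient is computed for it.
∇a, ∇b = ReverseDiff.gradient((a, b) -> f(a, b, c), (a, b))
# For this f, ∇a == b and ∇b == a.
```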
Hi, I need to evaluate the gradient of a function with my custom type/operations. For example if `f(x) = x^3`, I want the derivative obtained with the usual rules of...
Is it possible to define a pullback via the `ReverseDiff.@grad` macro or a `ChainRulesCore.rrule` and use it with a compiled tape? When I try either approach with a simple function...
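For reference, the customary `@grad` pattern looks like the sketch below (the names are illustrative). Whether such a custom pullback survives `ReverseDiff.compile` depends on how the instruction is recorded onto the tape; this only shows the definition itself, on an uncompiled path:

```julia
import ReverseDiff

mysquare(x) = x .^ 2
# Intercept tracked inputs so the custom rule is recorded on the tape.
mysquare(x::ReverseDiff.TrackedArray) = ReverseDiff.track(mysquare, x)

ReverseDiff.@grad function mysquare(x)
    xv = ReverseDiff.value(x)
    # Return the primal value and a pullback mapping Δ to input tangents.
    return xv .^ 2, Δ -> (2 .* xv .* Δ,)
end

g = ReverseDiff.gradient(x -> sum(mysquare(x)), [1.0, 2.0, 3.0])
# With the custom rule, g == [2.0, 4.0, 6.0].
```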
I asked this question in the #autodiff channel on Slack but haven't found out if I'm doing something wrong, hitting a bug in ReverseDiff, or hitting an incompatibility with DifferentialEquations....