
Bugged fallback in grouped Convs


Hi,

I defined a grouped convolution in Flux with C = Conv((1,1), 2=>2, groups=2). When I feed a Float64 array to this convolutional layer, e.g. with C(rand(10,10,2,1)), I first get a "slow fallback" warning, and then an AssertionError: DimensionMismatch; see the stack trace below.

This error should not be here, and it is very misleading: it is by no means a dimension-mismatch problem (the dimensions are fine) but is apparently linked to the element types. Indeed, as the warning suggests, the error disappears when I use C(rand(Float32, 10,10,2,1)).

Classical (non-grouped) convolutions do not raise this kind of error.

julia> C = Conv((1,1), 2=>2, groups=2)
Conv((1, 1), 1 => 2)  # 4 parameters

julia> C(rand(10,10,2,1))
┌ Warning: Slow fallback implementation invoked for conv!  You probably don't want this; check your datatypes.
│   yT = Float64
│   T1 = Float64
│   T2 = Float32
└ @ NNlib ~/.julia/packages/NNlib/P9BhZ/src/conv.jl:291
ERROR: AssertionError: DimensionMismatch("Data input channel count (2 vs. 2)")
Stacktrace:
  [1] check_dims(x::NTuple{5, Int64}, w::NTuple{5, Int64}, y::NTuple{5, Int64}, cdims::DenseConvDims{3, (1, 1, 1), 2, 2, 2, (1, 1, 1), (0, 0, 0, 0, 0, 0), (1, 1, 1), false})
    @ NNlib ~/.julia/packages/NNlib/P9BhZ/src/dim_helpers/DenseConvDims.jl:73
  [2] conv_direct!(y::Array{Float64, 5}, x::Array{Float64, 5}, w::Array{Float32, 5}, cdims::DenseConvDims{3, (1, 1, 1), 2, 2, 2, (1, 1, 1), (0, 0, 0, 0, 0, 0), (1, 1, 1), false}; alpha::Float64, beta::Bool)
    @ NNlib ~/.julia/packages/NNlib/P9BhZ/src/impl/conv_direct.jl:51
  [3] conv_direct!
    @ ~/.julia/packages/NNlib/P9BhZ/src/impl/conv_direct.jl:51 [inlined]
  [4] conv!(y::Array{Float64, 5}, in1::Array{Float64, 5}, in2::Array{Float32, 5}, cdims::DenseConvDims{3, (1, 1, 1), 2, 2, 2, (1, 1, 1), (0, 0, 0, 0, 0, 0), (1, 1, 1), false}; kwargs::Base.Iterators.Pairs{Union{}, Union{}, Tuple{}, NamedTuple{(), Tuple{}}})
    @ NNlib ~/.julia/packages/NNlib/P9BhZ/src/conv.jl:293
  [5] conv!(y::Array{Float64, 5}, in1::Array{Float64, 5}, in2::Array{Float32, 5}, cdims::DenseConvDims{3, (1, 1, 1), 2, 2, 2, (1, 1, 1), (0, 0, 0, 0, 0, 0), (1, 1, 1), false})
    @ NNlib ~/.julia/packages/NNlib/P9BhZ/src/conv.jl:291
  [6] conv!(y::Array{Float64, 4}, x::Array{Float64, 4}, w::Array{Float32, 4}, cdims::DenseConvDims{2, (1, 1), 2, 2, 2, (1, 1), (0, 0, 0, 0), (1, 1), false}; kwargs::Base.Iterators.Pairs{Union{}, Union{}, Tuple{}, NamedTuple{(), Tuple{}}})
    @ NNlib ~/.julia/packages/NNlib/P9BhZ/src/conv.jl:151
  [7] conv!
    @ ~/.julia/packages/NNlib/P9BhZ/src/conv.jl:151 [inlined]
  [8] conv(x::Array{Float64, 4}, w::Array{Float32, 4}, cdims::DenseConvDims{2, (1, 1), 2, 2, 2, (1, 1), (0, 0, 0, 0), (1, 1), false}; kwargs::Base.Iterators.Pairs{Union{}, Union{}, Tuple{}, NamedTuple{(), Tuple{}}})
    @ NNlib ~/.julia/packages/NNlib/P9BhZ/src/conv.jl:91
  [9] conv(x::Array{Float64, 4}, w::Array{Float32, 4}, cdims::DenseConvDims{2, (1, 1), 2, 2, 2, (1, 1), (0, 0, 0, 0), (1, 1), false})
    @ NNlib ~/.julia/packages/NNlib/P9BhZ/src/conv.jl:89
 [10] (::Conv{2, 4, typeof(identity), Array{Float32, 4}, Vector{Float32}})(x::Array{Float64, 4})
    @ Flux ~/.julia/packages/Flux/ZnXxS/src/layers/conv.jl:163
 [11] top-level scope
    @ REPL[5]:1
 [12] top-level scope
    @ ~/.julia/packages/CUDA/YpW0k/src/initialization.jl:52
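
For anyone hitting this before the fix: the underlying issue is that the layer's weights are Float32 while the input is Float64, which routes dispatch through the slow conv_direct! fallback, where the grouped-convolution dimension check then fires with the misleading message above. A minimal workaround (a sketch, not part of the original report) is to make the input eltype match the weights:

```julia
using Flux

C = Conv((1, 1), 2 => 2, groups=2)   # weights are Float32 by default

x = rand(10, 10, 2, 1)               # Float64 input triggers the fallback

# Convert the input to Float32 before applying the layer:
y = C(Float32.(x))

# Or generate Float32 data from the start:
y2 = C(rand(Float32, 10, 10, 2, 1))

size(y)  # (10, 10, 2, 1)
```

Keeping inputs in Float32 also avoids the slow fallback entirely, so this is worth doing for performance reasons even once the error message is fixed.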

See also the related JuliaLang discussion.

SimonCoste avatar Dec 27 '21 08:12 SimonCoste

Good eye; we should catch this error in NNlib at the same level as in the non-grouped versions.

DhairyaLGandhi avatar Dec 27 '21 17:12 DhairyaLGandhi

I opened a PR that should fix this: https://github.com/FluxML/NNlib.jl/pull/468.

gabrielpreviato avatar Feb 01 '23 20:02 gabrielpreviato

Closing as fixed by #468.

ToucheSir avatar Mar 03 '23 15:03 ToucheSir