
[Feature Request] Support for group convolution

Open digital-idiot opened this issue 4 years ago • 5 comments

Grouped convolution is well supported in PyTorch's convolution layers/ops. If possible, it would be great to add that capability to torchsparse.
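For context, here is a minimal sketch of the dense PyTorch feature being requested: the `groups` argument of `nn.Conv3d`, with `groups == in_channels` giving a depthwise convolution. The tensor sizes are illustrative.

```python
import torch
import torch.nn as nn

# Dense PyTorch supports grouped convolution via the `groups` argument;
# this is the capability requested for torchsparse's sparse convolutions.
x = torch.randn(1, 8, 16, 16, 16)  # (N, C, D, H, W)

grouped = nn.Conv3d(8, 8, kernel_size=3, groups=4, padding=1)
depthwise = nn.Conv3d(8, 8, kernel_size=3, groups=8, padding=1)  # groups == in_channels

# With groups=4, each filter only sees 8/4 = 2 input channels,
# so the weight tensor is (C_out, C_in/groups, K, K, K).
print(grouped.weight.shape)    # torch.Size([8, 2, 3, 3, 3])
print(depthwise.weight.shape)  # torch.Size([8, 1, 3, 3, 3])
```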

digital-idiot avatar Mar 31 '22 15:03 digital-idiot

Thanks for bringing this up! The reason that we do not support grouped convolution is that it does not offer much speedup for sparse workloads. This is mainly because sparse convolution is memory-bounded instead of computation-bounded. That said, I think supporting this is still meaningful, and we probably need to do more optimization for it.

zhijian-liu avatar Apr 01 '22 04:04 zhijian-liu

I'm also interested in support for grouped operations.

For the design of lightweight networks such as MobileNet, depthwise separable convolutions (that is, setting the number of groups equal to the number of input channels) can significantly reduce the number of parameters.

For a kernel size of KxK (2D) or KxKxK (3D) and C channels (assuming equal input and output channels):

| parameters | 2D | 3D |
| --- | --- | --- |
| convolution | KxKxCxC | KxKxKxCxC |
| depthwise separable convolution | KxKxC + 1x1xCxC | KxKxKxC + 1x1x1xCxC |
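The counts in the table can be checked with dense PyTorch modules (bias disabled so only convolution weights are counted; K and C are example values):

```python
import torch.nn as nn

K, C = 3, 64

conv = nn.Conv3d(C, C, kernel_size=K, bias=False)                 # KxKxKxCxC weights
depthwise = nn.Conv3d(C, C, kernel_size=K, groups=C, bias=False)  # KxKxKxC weights
pointwise = nn.Conv3d(C, C, kernel_size=1, bias=False)            # 1x1x1xCxC weights

n_conv = sum(p.numel() for p in conv.parameters())
n_sep = (sum(p.numel() for p in depthwise.parameters())
         + sum(p.numel() for p in pointwise.parameters()))

assert n_conv == K**3 * C * C      # 110592 for K=3, C=64
assert n_sep == K**3 * C + C * C   # 5824 for K=3, C=64
```

For K=3 and C=64 this is roughly a 19x reduction in parameters.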

Increasing the convolution kernel size raises the compute-to-memory ratio, which may break the memory-bound bottleneck of sparse convolution. RepLKNet made an attempt in this direction: https://arxiv.org/abs/2203.06717.

ruanych avatar Apr 07 '22 05:04 ruanych

Thanks for providing the model-size perspective! We will take that into consideration.

zhijian-liu avatar Apr 08 '22 17:04 zhijian-liu

Is there any update on grouped sparse convolution? I am trying to build some capsule layers using 3D depthwise convolution.
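Until grouped sparse convolution lands, one (inefficient) workaround is to emulate groups by slicing the feature channels and running a separate convolution per group. The sketch below shows the channel-splitting idea with dense tensors and verifies it matches a single grouped convolution; the same splitting would apply to a sparse tensor's feature matrix, though torchsparse specifics are not shown here.

```python
import torch
import torch.nn as nn

# Emulate a grouped conv with G independent convolutions over channel slices.
G, C = 4, 8
x = torch.randn(1, C, 8, 8, 8)

convs = nn.ModuleList(
    nn.Conv3d(C // G, C // G, kernel_size=3, padding=1, bias=False) for _ in range(G)
)
chunks = torch.chunk(x, G, dim=1)                              # split channels into G groups
y = torch.cat([conv(c) for conv, c in zip(convs, chunks)], dim=1)

# Equivalent to one grouped conv carrying the same weights:
grouped = nn.Conv3d(C, C, kernel_size=3, groups=G, padding=1, bias=False)
with torch.no_grad():
    grouped.weight.copy_(torch.cat([conv.weight for conv in convs], dim=0))
assert torch.allclose(y, grouped(x), atol=1e-5)
```

This trades one kernel launch for G launches, which is exactly why native support (rather than a Python-level loop) matters for performance.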

hontrn9122 avatar Apr 13 '24 07:04 hontrn9122