Operator requests
Currently dabnn only supports primitive operators, such as Conv, FullyConnected, Pool, Add, Concat and ReLU. While these are already enough for many networks (e.g., ResNet, SqueezeNet), more operators may be needed for more complicated ones.
Moreover, I believe that deploying BNNs needs effort from both BNN researchers and the engineers of inference frameworks (like dabnn), so there should be a way for BNN researchers and dabnn developers to communicate.
Therefore, this issue is opened to collect operator requests. It helps me prioritize the various operators. Please reply to this issue if you need operators that have not been implemented in dabnn, and I will implement them as time allows.
Requested operators list:
- [x] PReLU
- [x] Reshape/Flatten before GEMM
@i-amgeek Thanks for your report!
In general, Reshape is hard to implement because of the layout difference between NCHW (ONNX) and NHWC (dabnn). However, the case where a reshape comes right before GEMM/MatMul is trivial. I'll support it soon.
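To illustrate why a general Reshape is layout-sensitive but a reshape feeding a GEMM is easy to handle, here is a minimal numpy sketch (not dabnn code; the tensor shapes and weights are made up). Flattening an NCHW tensor and its NHWC counterpart yields different element orders, but when the flattened vector goes straight into a fully-connected layer, the layout permutation can simply be folded into the weight matrix:

```python
import numpy as np

# Toy activation in NCHW (ONNX) layout and its NHWC (dabnn) counterpart.
nchw = np.arange(24).reshape(1, 2, 3, 4)   # N, C, H, W
nhwc = nchw.transpose(0, 2, 3, 1)          # N, H, W, C

# Naively flattening each layout gives different element orders,
# which is why a general Reshape is hard to translate between layouts.
assert not np.array_equal(nchw.reshape(1, -1), nhwc.reshape(1, -1))

# But when the flatten feeds a GEMM/FullyConnected, the permutation can
# be folded into the weights. idx[j] = the NCHW-flat index that ends up
# at NHWC-flat position j, so reordering the weight rows by idx makes
# the NHWC GEMM produce the same result as the original NCHW GEMM.
idx = np.arange(24).reshape(1, 2, 3, 4).transpose(0, 2, 3, 1).reshape(-1)
rng = np.random.default_rng(0)
w = rng.standard_normal((24, 5))           # hypothetical FC weights (NCHW order)
assert np.allclose(nchw.reshape(1, -1) @ w, nhwc.reshape(1, -1) @ w[idx])
```

Since the weight reordering happens once at model-conversion time, no runtime transpose is needed, which is what makes the reshape-before-GEMM case trivial compared with a general Reshape.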
Resize, ConvTranspose, Transpose, Clip, LeakyReLU, thanks!
How about 3D Conv/Pooling layers?