dabnn
dabnn is an accelerated binary neural network inference framework for mobile platforms.
`terminate called after throwing an instance of 'std::out_of_range' what(): _Map_base::at Aborted (core dumped)`
Hello guys, I was very interested in what you've been building, and wanted to ask you a bit more, if you have some time to point me in the right direction. I wanted to know:...
Currently dabnn only supports primitive operators, such as Conv, FullyConnected, Pool, Add, Concat and ReLU. While these are already enough for many networks (e.g., ResNet, SqueezeNet), some more operators may...
Hello, I managed to get a converted model working using a Linux VM, but found two issues with the `0.2.6` .exe file when I tried doing the same thing in...
**Benchmark compile error** **device**: Jetson Nano **os**: Ubuntu 18.04 **gcc 7.4 / g++ 7.4 / cmake 3.10.2** When I try to compile dabnn and its third-party code on the Jetson Nano under Ubuntu 16.04 (ARM, non-Android), `cmake ..` completes without errors, but when...
Hi, I am trying to run dabnn on a non-Android ARM device (Raspberry Pi 3 B+) and have successfully cloned and built the project according to build.md. Do you have...
First of all, thank you for your hard work in providing the community with such a good framework! I've been digging through many binary network papers & repos because I...
Hi~ I have tested your pre-trained Bi-Real Net (18-layer) on a Huawei Mate 30 Pro and got an inference time of 20 ms, which was nice. However, when I built dabnn on a Hi3516dv300...
Hi~ I really appreciate this excellent open-source framework! I am having some trouble integrating your standard binarization procedure (padding rules, etc.) with some open-source Bi-Real Net implementations. Could you please...
Hi, I want to convert an .onnx model to a dabnn model, but after I ran the given command there was no output at all. Could you please tell me some...