
Is the inference time of MobileNetV2 smaller than V1?

Open yuanze-lin opened this issue 7 years ago • 4 comments

Hello, in this project, is MobileNetV2 faster? If so, what is the FPS of V2?

yuanze-lin avatar Jun 18 '18 04:06 yuanze-lin

I have tested this: the inference time of MobileNetV2 is about 19 ms, and MobileNetV1 is about 11 ms.

yiran-THU avatar Jun 18 '18 17:06 yiran-THU

@yiran-THU Thank you for your response, but the paper mentions that the inference time of MobileNetV2 should be less than MobileNetV1. Are there any other versions of MobileNetV2?

yuanze-lin avatar Jun 19 '18 14:06 yuanze-lin

The original implementation comes from here: https://github.com/tensorflow/models/blob/master/research/object_detection/g3doc/detection_model_zoo.md. There the inference time of v2 is 27 ms and v1 is 31 ms, but those numbers are based on the TensorFlow framework.

In this project, the deploy model of MobileNet-v1 was generated by merge_bn.py, which folds the batch-norm and scale layers into the convolution layers, so it runs faster (see the folding sketch after this comment): https://github.com/chuanqi305/MobileNet-SSD/blob/master/merge_bn.py

eric612 avatar Jun 19 '18 15:06 eric612
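For reference, the speed-up from merge_bn.py comes from folding the per-channel batch-norm and scale parameters into the preceding convolution's weights and bias, so those layers disappear at inference time. The sketch below shows the underlying arithmetic in plain NumPy; the function name and parameter layout are illustrative rather than the script's actual interface, and Caffe's moving-average scale factor is ignored for simplicity.

```python
import numpy as np

def fold_bn_scale_into_conv(W, b, bn_mean, bn_var, scale_gamma, scale_beta, eps=1e-5):
    """Fold a BatchNorm layer followed by a Scale layer into the preceding conv.

    W : (out_ch, in_ch, kh, kw) convolution weights
    b : (out_ch,) convolution bias (zeros if the conv has no bias)
    bn_mean, bn_var : (out_ch,) running statistics of the BatchNorm layer
    scale_gamma, scale_beta : (out_ch,) learned Scale layer parameters
    """
    # Per-channel multiplier applied by BN + Scale: gamma / sqrt(var + eps)
    alpha = scale_gamma / np.sqrt(bn_var + eps)
    # Folded weights: scale each output channel's filter by its multiplier
    W_folded = W * alpha[:, None, None, None]
    # Folded bias: pass the original bias through BN + Scale
    b_folded = (b - bn_mean) * alpha + scale_beta
    return W_folded, b_folded

# Example: fold BN + Scale for a conv with 32 output channels and 3x3 kernels.
W = np.random.randn(32, 16, 3, 3).astype(np.float32)
b = np.zeros(32, dtype=np.float32)
W_f, b_f = fold_bn_scale_into_conv(
    W, b,
    bn_mean=np.random.randn(32).astype(np.float32),
    bn_var=np.abs(np.random.randn(32)).astype(np.float32),
    scale_gamma=np.ones(32, dtype=np.float32),
    scale_beta=np.zeros(32, dtype=np.float32),
)
```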

@eric612 Hi, so if you use merge_bn.py to deploy the MobileNet-v2 model, could it be faster as well?

yuanze-lin avatar Jun 19 '18 15:06 yuanze-lin