tflite-support
MobileNetV3 quantization
Hi! I'm trying to quantize MobileNetV3 with TFLite, but the int8 model performs very poorly. I think this is because linear (uniform) quantization is too simple a method and isn't appropriate for every weight distribution. What else can I try? Are you planning to support a logarithmic scale for quantization in the future?
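
For reference, this is roughly the conversion setup I mean (a minimal sketch of full-integer post-training quantization; the Keras `MobileNetV3Large` model and the random `representative_data_gen` are placeholders for the actual model and calibration data):

```python
import numpy as np
import tensorflow as tf

# Placeholder model: swap in your own MobileNetV3 here.
model = tf.keras.applications.MobileNetV3Large(weights="imagenet")

def representative_data_gen():
    # Placeholder calibration data: replace with real preprocessed
    # samples drawn from the training/validation set.
    for _ in range(100):
        yield [np.random.rand(1, 224, 224, 3).astype(np.float32)]

converter = tf.lite.TFLiteConverter.from_keras_model(model)
converter.optimizations = [tf.lite.Optimize.DEFAULT]
converter.representative_dataset = representative_data_gen
# Force full int8 quantization of ops, inputs, and outputs.
converter.target_spec.supported_ops = [tf.lite.OpsSet.TFLITE_BUILTINS_INT8]
converter.inference_input_type = tf.int8
converter.inference_output_type = tf.int8

tflite_quant_model = converter.convert()
with open("mobilenet_v3_int8.tflite", "wb") as f:
    f.write(tflite_quant_model)
```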