TFLite inference on Colab different than on mobile
Hi,
I have trained "ssd_mobilenet_v2_fpnlite_320x320_coco17_tpu-8" on a custom dataset using Colab. The TFLite model works fine on Colab, but loading it in Flutter gives much less accurate results; the output from the object detection sample app doesn't make sense.
for example:
TFLite inference run on Colab:
but running it in a Flutter app gives this result:
What could be the reason, and where should I be looking?
Thank you for any insights.
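One common cause of this kind of mismatch is a preprocessing difference between the Colab pipeline and the Flutter app (input size, normalization range, float vs. quantized input). A minimal sketch of what I usually check on the Python side is below (the model path is hypothetical); the values it prints should be reproduced exactly in the Flutter preprocessing:

```python
import numpy as np
import tensorflow as tf

# Hypothetical path to the exported detection model
interpreter = tf.lite.Interpreter(model_path="model.tflite")
interpreter.allocate_tensors()

# Inspect what the model actually expects on its input tensor
for detail in interpreter.get_input_details():
    print("name:", detail["name"])
    print("shape:", detail["shape"])                 # e.g. [1, 320, 320, 3]
    print("dtype:", detail["dtype"])                 # float32 vs uint8 matters a lot
    print("quantization:", detail["quantization"])   # (scale, zero_point)

# Run one deterministic input through the interpreter to get a reference
# output that the Flutter result can be compared against.
inp = interpreter.get_input_details()[0]
image = np.zeros(inp["shape"], dtype=inp["dtype"])   # replace with a real, preprocessed image
interpreter.set_tensor(inp["index"], image)
interpreter.invoke()
for detail in interpreter.get_output_details():
    print(detail["name"], interpreter.get_tensor(detail["index"]).flatten()[:10])
```

If the raw outputs for the same bytes already differ between Colab and Flutter, the issue is in the interpreter/delegate; if they match, it is almost certainly in the pre- or post-processing on the Flutter side.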
@nelzaatari are you detecting this in real time? What packages are you using for this?
Same issue here.
I have two models (both exported from YOLOv8).
They run on two platforms (Python on Kaggle and tflite_flutter in Flutter).
Model 1 is fine: Python and Flutter inference both produce the same result. But Model 2 running in Flutter gives different results than in Python (Kaggle).
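Since one model matches across platforms and the other doesn't, one way to narrow it down (a sketch assuming both are plain YOLOv8 TFLite exports; the file names are hypothetical) is to feed each model a fixed, deterministic input in Python, record the raw output, then feed the identical tensor through tflite_flutter and compare. Matching raw outputs point to a difference in letterboxing/normalization/NMS; differing raw outputs point to the model or delegate itself.

```python
import numpy as np
import tensorflow as tf

def raw_output(model_path):
    """Run a fixed, constant input through a TFLite model and return the raw output tensor."""
    interpreter = tf.lite.Interpreter(model_path=model_path)
    interpreter.allocate_tensors()
    inp = interpreter.get_input_details()[0]
    out = interpreter.get_output_details()[0]
    # Deterministic input: the same tensor can be rebuilt byte-for-byte in Dart.
    data = np.full(inp["shape"], 0.5, dtype=inp["dtype"])
    interpreter.set_tensor(inp["index"], data)
    interpreter.invoke()
    return interpreter.get_tensor(out["index"])

# Hypothetical file names for the two YOLOv8 exports
for path in ["model1.tflite", "model2.tflite"]:
    result = raw_output(path)
    print(path, result.shape, result.flatten()[:10])
```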