mobile_models
MLPerf™ Mobile models
The Flutter application downloads models from this repo to enable workload runs. As part of the Flutter integration, can Intel upload the v1.1 version of the model to this repo now?
In the model downloaded from https://github.com/fatihcakirs/mobile_models/blob/main/v0_7/tflite/mobilebert_int8_384_20200602.tflite, some fully-connected weights have a non-zero zero point (e.g. weight `bert/encoder/layer_0/attention/self/MatMul19` has zero-point = 6), which violates the [TFLite quantization spec](https://www.tensorflow.org/lite/performance/quantization_spec). I am afraid this...
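For reference, one way to reproduce this check is to dump each tensor's quantization parameters with the TFLite Python interpreter. This is only a sketch, assuming TensorFlow is installed and the model file has been downloaded locally (the local file name below is an assumption); the output still needs manual filtering, since the spec only requires zero_point = 0 for int8 weight tensors, while int8 activations may legitimately have non-zero zero points.

```python
# Sketch: list int8 tensors with non-zero zero points in a TFLite model.
# Constant weight tensors appearing in this list would violate the
# TFLite quantization spec (weights must have zero_point == 0).
import numpy as np
import tensorflow as tf

# Path is an assumption: adjust to wherever the model was downloaded.
interpreter = tf.lite.Interpreter(model_path="mobilebert_int8_384_20200602.tflite")

for detail in interpreter.get_tensor_details():
    zero_points = detail["quantization_parameters"]["zero_points"]
    if detail["dtype"] == np.int8 and zero_points.size and np.any(zero_points != 0):
        print(detail["name"], "zero_points:", zero_points[:8])
```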
https://mlcommons.org/en/credits/ needs to be updated. Can we fill in https://docs.google.com/spreadsheets/d/17q_OtvVI_C5ET0shUTObHwiplUVRj5gFkU1dTkU7AcQ/edit#gid=0 for MobileNetEdgeTPU, DeepLabV3, OLD SSD, MobileDETs, and MobileBERT?
We use the Stable Diffusion v1.5 models, [OpenAI's CLIP model](https://github.com/openai/CLIP), and the COCO 2014 caption dataset. Do we need to include their licenses?