UnderstandingNLP
How can I run inference on a TFLite model?
I have converted a Hugging Face PyTorch masked language model using your notebook, and it worked great. However, I am now unable to use the converted TFLite model for inference.
Example: the original model took a sentence as input, e.g. "I'm a student at
Hi @ShahzebAli42, I checked this issue. If we run on GPU, the runtime crashes in Colab. On CPU, if you want to run TFLite inference for the MLM task, you can use code like this notebook.
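For reference, here is a minimal sketch of the CPU inference flow with `tf.lite.Interpreter`. Since the actual converted MLM model file isn't available here, it builds a tiny stand-in Keras model and converts it in-memory; the shapes, the stand-in model, and the mask position are illustrative assumptions, but the `set_tensor` / `invoke` / `get_tensor` pattern is the same one you would use with the real converted model.

```python
import numpy as np
import tensorflow as tf

# Stand-in for the converted MLM model (illustrative only): maps token ids
# to per-token "logits". With the real model, load your converted .tflite
# file via tf.lite.Interpreter(model_path=...) instead.
seq_len, vocab_size = 8, 32
model = tf.keras.Sequential([
    tf.keras.Input(shape=(seq_len,), dtype=tf.int32),
    tf.keras.layers.Embedding(vocab_size, 16),
    tf.keras.layers.Dense(vocab_size),
])
tflite_model = tf.lite.TFLiteConverter.from_keras_model(model).convert()

# CPU inference with the TFLite interpreter.
interpreter = tf.lite.Interpreter(model_content=tflite_model)
interpreter.allocate_tensors()
input_details = interpreter.get_input_details()
output_details = interpreter.get_output_details()

# Dummy input_ids; with the real model these come from the HF tokenizer.
input_ids = np.random.randint(0, vocab_size, (1, seq_len)).astype(np.int32)
interpreter.set_tensor(input_details[0]['index'], input_ids)
interpreter.invoke()
logits = interpreter.get_tensor(output_details[0]['index'])

# For MLM, predict the token at the [MASK] position via argmax over logits
# (mask_pos is a placeholder; find it from the tokenizer's mask_token_id).
mask_pos = 3
predicted_id = int(np.argmax(logits[0, mask_pos]))
print(logits.shape, predicted_id)
```

With the real converted model, you would tokenize the masked sentence with the matching Hugging Face tokenizer, feed `input_ids` (and any attention-mask input the model expects) to the interpreter, and decode the argmax token id at the mask position back to text.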