bert_document_classification
Confusion with result calculations
While observing a few predictions, I came across some examples where the model predicted [0,0,0,0]. In such cases, the micro-average F1-score is 0.75. How do we calculate the FPR (False Positive Rate) and TPR (True Positive Rate) for such cases? Suppose the correct label is "Unknown" and the predicted outcome is "[]" (basically, the model predicted no labels). What should we infer from such cases?
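For concreteness, this is how I currently compute per-label TPR/FPR. It is just a minimal sketch using scikit-learn's `multilabel_confusion_matrix`; the `y_true`/`y_pred` arrays are made-up examples (not actual model outputs), and I'm unsure whether this is the intended way to handle the all-zero prediction case:

```python
import numpy as np
from sklearn.metrics import multilabel_confusion_matrix

# Hypothetical example: row 0 mimics the case above, where the
# correct label is "Unknown" but the model predicts no labels.
y_true = np.array([[1, 0, 0, 0],
                   [0, 1, 0, 0]])
y_pred = np.array([[0, 0, 0, 0],  # model predicted [0,0,0,0]
                   [0, 1, 0, 0]])

# multilabel_confusion_matrix returns one 2x2 matrix per label,
# laid out as [[TN, FP], [FN, TP]].
for label, cm in enumerate(multilabel_confusion_matrix(y_true, y_pred)):
    tn, fp, fn, tp = cm.ravel()
    tpr = tp / (tp + fn) if (tp + fn) else float("nan")  # undefined if no positives
    fpr = fp / (fp + tn) if (fp + tn) else float("nan")  # undefined if no negatives
    print(f"label {label}: TPR={tpr:.2f}, FPR={fpr:.2f}")
```

With this computation, an all-zero prediction contributes only false negatives for the true label (TPR drops, FPR stays 0), but I'd like to confirm that this interpretation matches how the reported metrics are calculated.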
Kindly reply as soon as possible. Thank you.