
Incorrect evaluation metrics

Open mohayl opened this issue 3 years ago • 0 comments

In the evaluation of binary metrics, `precision_score`, `recall_score`, and `f1_score` are called with the label and prediction arguments in the wrong order (scikit-learn expects `y_true` first, then `y_pred`):

```python
def compute_binary_metrics(anomaly_pred, anomaly_label, adjustment=False):
    if not adjustment:
        eval_anomaly_pred = anomaly_pred
        metrics = {
            "f1": f1_score(eval_anomaly_pred, anomaly_label),
            "pc": precision_score(eval_anomaly_pred, anomaly_label),
            "rc": recall_score(eval_anomaly_pred, anomaly_label),
        }
```

It should be:

```python
def compute_binary_metrics(anomaly_pred, anomaly_label, adjustment=False):
    if not adjustment:
        eval_anomaly_pred = anomaly_pred
        metrics = {
            "f1": f1_score(anomaly_label, eval_anomaly_pred),
            "pc": precision_score(anomaly_label, eval_anomaly_pred),
            "rc": recall_score(anomaly_label, eval_anomaly_pred),
        }
```
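For illustration only (a toy example, not code from the repository), a minimal sketch of why the order matters: with scikit-learn's metrics, swapping `y_true` and `y_pred` exchanges precision and recall, while `f1_score` happens to be symmetric under the swap and so masks the bug.

```python
from sklearn.metrics import f1_score, precision_score, recall_score

# Hypothetical toy data: 3 true anomalies, of which only 1 is detected.
y_true = [1, 1, 1, 0, 0, 0]
y_pred = [1, 0, 0, 0, 0, 0]

# Correct argument order: y_true first, y_pred second.
correct_pc = precision_score(y_true, y_pred)  # the one alarm is a real anomaly
correct_rc = recall_score(y_true, y_pred)     # only one of three anomalies found

# Swapped order, as in the current code: precision and recall trade places.
swapped_pc = precision_score(y_pred, y_true)
swapped_rc = recall_score(y_pred, y_true)

print(correct_pc, correct_rc)  # 1.0 0.3333...
print(swapped_pc, swapped_rc)  # 0.3333... 1.0

# f1 is unchanged by the swap, so it alone would not reveal the bug.
print(f1_score(y_true, y_pred) == f1_score(y_pred, y_true))
```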

mohayl avatar Jan 19 '23 15:01 mohayl