
Evaluation metrics - Object detection

Open · csudre opened this issue on Oct 22, 2020 · 5 comments

Is your feature request related to a problem? Please describe. There is a need for the following metrics when dealing with object detection problems.

Intersection over Union (IoU) measurements can be made either at the level of the objects' bounding boxes or at the level of the segmentations themselves.
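
For reference, a minimal sketch of both variants; the function names and array layouts here are illustrative, not an existing MONAI API:

```python
import numpy as np

def box_iou(box_a, box_b):
    """IoU of two axis-aligned 2-D boxes given as (x1, y1, x2, y2)."""
    x1 = max(box_a[0], box_b[0])
    y1 = max(box_a[1], box_b[1])
    x2 = min(box_a[2], box_b[2])
    y2 = min(box_a[3], box_b[3])
    inter = max(0.0, x2 - x1) * max(0.0, y2 - y1)
    area_a = (box_a[2] - box_a[0]) * (box_a[3] - box_a[1])
    area_b = (box_b[2] - box_b[0]) * (box_b[3] - box_b[1])
    union = area_a + area_b - inter
    return inter / union if union > 0 else 0.0

def mask_iou(mask_a, mask_b):
    """IoU of two boolean segmentation masks of identical shape."""
    inter = np.logical_and(mask_a, mask_b).sum()
    union = np.logical_or(mask_a, mask_b).sum()
    return inter / union if union > 0 else 0.0
```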

At the element level (see the sketch after this list):

  • Dice per element
  • Centroid distance
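
A minimal sketch of these per-element measures for one matched prediction/ground-truth pair; the matching of elements is assumed to have been done beforehand, and the names are illustrative only:

```python
import numpy as np
from scipy import ndimage

def dice_per_element(pred_mask, gt_mask):
    """Dice coefficient for one matched pair of boolean masks."""
    inter = np.logical_and(pred_mask, gt_mask).sum()
    denom = pred_mask.sum() + gt_mask.sum()
    return 2.0 * inter / denom if denom > 0 else 0.0

def centroid_distance(pred_mask, gt_mask, spacing=(1.0, 1.0, 1.0)):
    """Euclidean distance between mask centroids (3-D by default), in physical units."""
    c_pred = np.array(ndimage.center_of_mass(pred_mask)) * np.array(spacing)
    c_gt = np.array(ndimage.center_of_mass(gt_mask)) * np.array(spacing)
    return float(np.linalg.norm(c_pred - c_gt))
```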

At the level of the image, considering independent elements (see the sketch after this list):

  • F1 score with IoU thresholds ranging from 0.5 to 0.95
  • Average precision with IoU thresholds ranging from 0.5 to 0.95 https://cocodataset.org/#detection-eval
  • Average recall with IoU thresholds ranging from 0.5 to 0.95 https://cocodataset.org/#detection-eval
  • Correlation of detected volumes
  • Number difference
  • F1 with minimum overlap of 1 voxel
  • Outline error https://link.springer.com/article/10.1186/1471-2342-12-17
  • Detection error https://link.springer.com/article/10.1186/1471-2342-12-17
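
As a rough illustration of the COCO-style scheme linked above, here is a hedged sketch of average precision at a single IoU threshold; the full metric averages this over thresholds 0.50:0.05:0.95, and the greedy matching here is simplified relative to the official evaluation. `iou_fn` can be either of the box/mask IoU functions sketched earlier:

```python
import numpy as np

def average_precision(pred_boxes, pred_scores, gt_boxes, iou_fn, iou_thr=0.5):
    """AP at one IoU threshold via greedy matching of score-sorted predictions."""
    order = np.argsort(-np.asarray(pred_scores))
    matched_gt = set()
    tp = np.zeros(len(order))
    for rank, idx in enumerate(order):
        ious = [iou_fn(pred_boxes[idx], g) for g in gt_boxes]
        best = int(np.argmax(ious)) if ious else -1
        if best >= 0 and ious[best] >= iou_thr and best not in matched_gt:
            tp[rank] = 1
            matched_gt.add(best)
    fp = 1 - tp
    cum_tp, cum_fp = np.cumsum(tp), np.cumsum(fp)
    recall = cum_tp / max(len(gt_boxes), 1)
    precision = cum_tp / np.maximum(cum_tp + cum_fp, 1e-9)
    # area under the precision-recall curve (no 101-point interpolation here)
    return float(np.trapz(precision, recall))
```

F1 at a given IoU threshold can be derived from the same TP/FP/FN counts.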

Additional context: This list of metrics is the output of the initial brainstorming of the metrics task force.

csudre avatar Oct 22 '20 06:10 csudre

Amazing list of detection metrics, thanks for all the effort of the metrics task force to collect them!

I am curious about the "F1 with minimum overlap of 1 voxel" metric; could someone add some papers which use it? (I would like to read them ^^)

Also a small addition: the Free-Response Receiver Operating Characteristic (FROC) might also be worth adding (it is used by LUNA, DeepLesion and RibFrac).
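
For completeness, a hedged sketch of how FROC points are typically computed (sensitivity at fixed average numbers of false positives per image); the detections are assumed to have already been labelled TP/FP by a prior matching step, and this is not an existing MONAI API:

```python
import numpy as np

def froc_points(scores, is_tp, num_gt, num_images,
                fp_per_image=(0.125, 0.25, 0.5, 1, 2, 4, 8)):
    """Sensitivity at fixed average false-positive rates per image (LUNA-style)."""
    order = np.argsort(-np.asarray(scores))
    tp = np.asarray(is_tp, dtype=float)[order]
    fp = 1.0 - tp
    sens = np.cumsum(tp) / max(num_gt, 1)          # sensitivity as the threshold lowers
    fp_rate = np.cumsum(fp) / max(num_images, 1)   # average false positives per image
    return [float(np.interp(f, fp_rate, sens)) for f in fp_per_image]
```

The LUNA challenge, for instance, reports the mean sensitivity over these seven operating points.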

mibaumgartner avatar Nov 09 '20 11:11 mibaumgartner

Hello everyone,

I am new here and would like to contribute by implementing some (or all) of the evaluation metrics on @csudre's list. Has anyone started working on them? If not, should I create a pull request (labelled [WIP]) and get going?

Thank you, Ilia

iliathesmirnov avatar Nov 12 '20 15:11 iliathesmirnov

welcome and please go ahead! (I don't think anyone is currently working on them)

wyli avatar Nov 12 '20 17:11 wyli

Today we discussed the requirement of FROC curves for lesion-based evaluation.

wyli avatar Jan 05 '21 17:01 wyli

Please check https://github.com/Project-MONAI/MONAI/blob/dev/monai/apps/detection/metrics/coco.py
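From memory, the COCO-style metric there is used roughly as sketched below (based on the MONAI detection tutorial); the argument names and return format are recalled rather than verified and may differ between versions, so please check coco.py and matching.py for the actual signatures:

```python
import numpy as np
from monai.apps.detection.metrics.coco import COCOMetric
from monai.apps.detection.metrics.matching import matching_batch
from monai.data.box_utils import box_iou

# one image with one ground-truth box and one predicted box (corner format)
gt_boxes = [np.array([[0, 0, 0, 10, 10, 10]], dtype=np.float32)]
gt_classes = [np.array([0])]
pred_boxes = [np.array([[1, 1, 1, 10, 10, 10]], dtype=np.float32)]
pred_classes = [np.array([0])]
pred_scores = [np.array([0.9], dtype=np.float32)]

metric = COCOMetric(classes=["lesion"], iou_list=[0.1], max_detection=[10])
results = matching_batch(
    iou_fn=box_iou,
    iou_thresholds=metric.iou_thresholds,
    pred_boxes=pred_boxes, pred_classes=pred_classes, pred_scores=pred_scores,
    gt_boxes=gt_boxes, gt_classes=gt_classes,
)
print(metric(results)[0])  # dict of AP / AR values over IoU thresholds
```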

Can-Zhao avatar Mar 30 '23 04:03 Can-Zhao