
What does pr_samples mean?

Open shentaowang opened this issue 7 years ago • 3 comments

I am puzzled by its meaning. Excuse me, could you explain it?

shentaowang avatar Mar 17 '18 13:03 shentaowang

Hi GeniusLight,

class DetectionMAP:
    def __init__(self, n_class, pr_samples=11, overlap_threshold=0.5):
        # Running computation of average precision of n_class in a bounding box + classification task
        # :param n_class:     number of classes
        # :param pr_samples:  number of points used to discretize the PR curve

It is the discretization of the PR curve.

Basically, to compute the PR curve we need to compute precision and recall at different confidence levels. For example, with pr_samples of 11 the confidence thresholds are taken from a "linspace" between 0 and 1, and precision/recall are computed at each confidence step.
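As a rough illustration (a minimal sketch, not the repository's actual code), such a confidence sweep could look like the following; the function name pr_curve and its arguments are hypothetical, and it assumes each detection comes with a confidence score and a flag marking whether it matched a ground-truth box:

import numpy as np

def pr_curve(confidences, is_true_positive, n_ground_truth, pr_samples=11):
    # Sweep pr_samples confidence thresholds over [0, 1] and compute
    # precision/recall for the detections kept at each threshold.
    confidences = np.asarray(confidences, dtype=float)
    is_true_positive = np.asarray(is_true_positive, dtype=bool)
    precisions, recalls = [], []
    for t in np.linspace(0.0, 1.0, pr_samples):
        keep = confidences >= t
        tp = np.count_nonzero(is_true_positive & keep)
        n_kept = np.count_nonzero(keep)
        precisions.append(tp / n_kept if n_kept else 1.0)
        recalls.append(tp / n_ground_truth if n_ground_truth else 0.0)
    return np.array(precisions), np.array(recalls)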

The VOC standard is 11, so usually you don't need to change this parameter.

Hope it helps!

MathGaron avatar Mar 17 '18 15:03 MathGaron

I think the 11 in the VOC paper means you take a "linspace" of recall values between 0 and 1, take the precision at each of those recall levels, and finally average the precisions. You can read https://github.com/facebookresearch/Detectron/blob/2f8161edc3092b0382cab535c977a180a8b3cc4d/lib/datasets/voc_eval.py (the code in the Detectron project).
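For reference, the 11-point interpolation used by the VOC 2007 metric (and by the linked Detectron voc_eval under its 11-point option) can be sketched roughly like this; voc_ap_11_point is a hypothetical name, and recall/precision are assumed to be the cumulative arrays obtained after sorting detections by decreasing confidence:

import numpy as np

def voc_ap_11_point(recall, precision):
    # At each recall level t in {0.0, 0.1, ..., 1.0}, take the maximum
    # precision among points with recall >= t, then average over the 11 levels.
    ap = 0.0
    for t in np.linspace(0.0, 1.0, 11):
        mask = recall >= t
        p = float(np.max(precision[mask])) if np.any(mask) else 0.0
        ap += p / 11.0
    return ap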

shentaowang avatar Mar 18 '18 01:03 shentaowang

OK, you are right, we have a slightly different way of handling the confidence; I will definitely check if/how it affects the score. Actually, I will link this to #10 and compare directly with Detectron's implementation.

I am quite busy right now, but I will fix that in the coming days. Thank you for your feedback! If you have any information or have run tests, do not hesitate to share them!

MathGaron avatar Mar 19 '18 19:03 MathGaron