Bugs: integer division and subplot handling for a single class
Thanks for your work! It's really handy, saving me a lot of time when learning YOLO in keras :)
I understand this program is written for Python 3. I am running it with Python 2.7, though, and a small modification makes it work.
First, in "mean_average_precision/mean_average_precision/utils/bbox.py", the function jaccard returns 0 when the bounding-box coordinates are all integers, because Python 2.7 performs floor division on integers. Changing line 49 from
return inter / union
to
return inter.astype('float32') / union
works well.
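For reference, here is a minimal, self-contained sketch of the casting fix. The jaccard signature and the [x1, y1, x2, y2] box layout are assumptions for illustration; the real function in bbox.py may differ:

```python
import numpy as np

def jaccard(box_a, box_b):
    # Intersection-over-union for two axis-aligned boxes [x1, y1, x2, y2].
    xa = np.maximum(box_a[0], box_b[0])
    ya = np.maximum(box_a[1], box_b[1])
    xb = np.minimum(box_a[2], box_b[2])
    yb = np.minimum(box_a[3], box_b[3])
    inter = np.maximum(xb - xa, 0) * np.maximum(yb - ya, 0)
    area_a = (box_a[2] - box_a[0]) * (box_a[3] - box_a[1])
    area_b = (box_b[2] - box_b[0]) * (box_b[3] - box_b[1])
    union = area_a + area_b - inter
    # Casting to float32 before dividing keeps the result correct even
    # when the coordinates are integers: under Python 2, `inter / union`
    # on integer operands silently floors to 0.
    return inter.astype('float32') / union
```

Under Python 3 the explicit cast is redundant (`/` is always true division) but harmless, so the same code runs correctly on both interpreters.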
Second, when calculating mAP for just one class, the plot function in "mean_average_precision/mean_average_precision/detection_map.py" raises an error:
Traceback (most recent call last):
File "mean_average_precision/detection_map.py", line 229, in plot
for i, ax in enumerate(axes.flat):
AttributeError: 'AxesSubplot' object has no attribute 'flat'
because when grid == 1, fig, axes = plt.subplots(nrows=grid, ncols=grid) returns axes as a single matplotlib.axes._subplots.AxesSubplot rather than an array of axes, so it has no flat attribute.
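The asymmetry is easy to reproduce in isolation, using nothing beyond matplotlib itself:

```python
import matplotlib
matplotlib.use('Agg')  # headless backend, no display needed
import matplotlib.pyplot as plt

# A 1x1 grid gives back a single Axes object...
fig1, axes1 = plt.subplots(nrows=1, ncols=1)
print(hasattr(axes1, 'flat'))   # False: no .flat on a lone AxesSubplot

# ...while anything larger gives a numpy array of Axes.
fig2, axes2 = plt.subplots(nrows=2, ncols=2)
print(hasattr(axes2, 'flat'))   # True: a 2x2 array supports .flat
```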
I modified it from
grid = int(math.ceil(math.sqrt(self.n_class)))
fig, axes = plt.subplots(nrows=grid, ncols=grid)
mean_average_precision = []
# TODO: data structure not optimal for this operation...
for i, ax in enumerate(axes.flat):
    if i > self.n_class - 1:
        break
    precisions, recalls = self.compute_precision_recall_(i, interpolated)
    average_precision = self.compute_ap(precisions, recalls)
    self.plot_pr(ax, i, precisions, recalls, average_precision)
    mean_average_precision.append(average_precision)
to
grid = int(math.ceil(math.sqrt(self.n_class)))
if grid == 1:
    fig, ax = plt.subplots()
    mean_average_precision = []
    precisions, recalls = self.compute_precision_recall_(0, interpolated)
    average_precision = self.compute_ap(precisions, recalls)
    self.plot_pr(ax, 0, precisions, recalls, average_precision)
    mean_average_precision.append(average_precision)
else:
    fig, axes = plt.subplots(nrows=grid, ncols=grid)
    mean_average_precision = []
    # TODO: data structure not optimal for this operation...
    for i, ax in enumerate(axes.flat):
        if i > self.n_class - 1:
            break
        precisions, recalls = self.compute_precision_recall_(i, interpolated)
        average_precision = self.compute_ap(precisions, recalls)
        self.plot_pr(ax, i, precisions, recalls, average_precision)
        mean_average_precision.append(average_precision)
to make it work, but I think there is probably a cleaner solution.
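One possibly cleaner alternative (untested against the library itself, but squeeze=False is standard matplotlib): rather than branching on grid == 1, ask plt.subplots to always return a 2D array of axes, so the original loop works unchanged. A standalone sketch with n_class standing in for self.n_class:

```python
import math
import matplotlib
matplotlib.use('Agg')  # headless backend, no display needed
import matplotlib.pyplot as plt

n_class = 1  # the single-class case that used to crash
grid = int(math.ceil(math.sqrt(n_class)))

# squeeze=False forces plt.subplots to return a 2D array of Axes even
# for a 1x1 grid, so `axes.flat` is valid and no if/else branch is needed.
fig, axes = plt.subplots(nrows=grid, ncols=grid, squeeze=False)
for i, ax in enumerate(axes.flat):
    if i > n_class - 1:
        break
    ax.set_title('class %d' % i)
```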
Hi, thanks for the bug report! I will fix this as soon as possible! I am quite busy right now but it should be pushed somewhere around next week.
Hi YuCosine, the code runs with your fix, but the result is still wrong when we only predict one class. For example:
pred_bb1 = np.array([])
pred_cls1 = np.array([], dtype=int)
pred_conf1 = np.array([])
gt_bb1 = np.array([[0.86132812, 0.48242188, 0.97460938, 0.6171875],
                   [0.18554688, 0.234375, 0.36132812, 0.41601562],
                   [0., 0.47265625, 0.0703125, 0.62109375],
                   [0.47070312, 0.3125, 0.77929688, 0.78125],
                   [0.8, 0.1, 0.9, 0.2]])
gt_cls1 = np.array([1, 1, 1, 1, 1, 1])
Since the model predicted no objects, the mean average precision should be 0, but we get mean average precision = 1.
@TrungHieu-Le
With only one class, the ground-truth class labels should be 0 (class indices start at 0).
I modified the last line to gt_cls1 = np.array([0, 0, 0, 0, 0, 0]) and got mAP = 0.
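To illustrate why 0 is the expected value here, a hypothetical minimal AP computation (average_precision and its arguments are illustrative, not the library's API):

```python
import numpy as np

def average_precision(pred_conf, pred_match, n_gt):
    """Hypothetical minimal AP sketch, not the library's actual API."""
    # With no predictions the precision-recall curve is empty, so AP = 0,
    # even though there are ground-truth boxes left undetected.
    if pred_conf.size == 0 or n_gt == 0:
        return 0.0
    order = np.argsort(-pred_conf)           # rank by descending confidence
    tp = pred_match[order].astype(float)     # 1 where a prediction matched a gt box
    cum_tp = np.cumsum(tp)
    precision = cum_tp / (np.arange(tp.size) + 1)
    # Pascal-VOC style: average the precision observed at each true positive.
    return float(np.sum(precision * tp) / n_gt)

# The empty-prediction example above: no detections, six gt boxes.
average_precision(np.array([]), np.array([]), 6)
```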
@MathGaron Thanks for your help. Have a nice weekend!
Actually @YuCosine did all the work for now! Thanks again!