Fix average precision at k calculation
This PR fixes #49
According to the Wikipedia page on Average Precision, the equation is defined as follows:

$$\operatorname{AveP} = \frac{\sum_{k=1}^{n} P(k) \times \mathrm{rel}(k)}{\text{number of relevant documents}}$$
where $P(k)$ is the precision at cutoff $k$ and $\mathrm{rel}(k)$ is an indicator function equaling 1 if the item at rank $k$ is a relevant document, and zero otherwise. Note that the average is over all relevant documents, and the relevant documents not retrieved get a precision score of zero.
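For reference, here is a minimal sketch of AP@k written directly from that definition (dividing by the number of relevant documents, with unretrieved relevant documents contributing zero). The function and argument names here are illustrative and may not match this repository's code exactly:

```python
def apk(actual, predicted, k=10):
    """Average precision at k, per the definition above.

    actual: collection of relevant items (order irrelevant).
    predicted: ranked list of predicted items, best first.
    """
    if not actual:
        return 0.0

    score = 0.0
    num_hits = 0
    seen = set()
    for i, p in enumerate(predicted[:k]):
        # rel(i+1) = 1 only for the first occurrence of a relevant item
        if p in actual and p not in seen:
            seen.add(p)
            num_hits += 1
            score += num_hits / (i + 1.0)  # precision at cutoff i+1

    # Average over ALL relevant documents: those not retrieved in the
    # top k contribute a precision of zero.
    return score / len(actual)
```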
Before, the average was computed over min(len(actual), k), the minimum of the length of the actual list and k. That doesn't match the definition above: relevant documents that are not retrieved should drag the score down, so as the length of the actual list grows past k, the AP@K should keep decreasing, but with the old divisor it is capped at k and the score stops decreasing.
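To make the difference concrete, here is a made-up example (the item names and the choice of k are hypothetical) showing how the two divisors diverge once the number of relevant documents exceeds k:

```python
actual = ["a", "b", "c", "d", "e", "f"]  # six relevant documents
predicted = ["a", "b", "x", "y", "z"]    # hits at ranks 1 and 2
k = 5

# Sum of precisions at the hits: P(1)*1 + P(2)*1 = 1/1 + 2/2 = 2.0
raw = 1 / 1 + 2 / 2

print(raw / min(len(actual), k))  # old divisor: 2.0 / 5 = 0.4
print(raw / len(actual))          # per the definition: 2.0 / 6 ~= 0.333
```

Adding a seventh relevant document would leave the first score unchanged but lower the second, which is the behavior the definition calls for.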
I fixed and cleaned up the code accordingly. Please consider merging this! The current behavior could lead to many incorrect results for anyone relying on this metric.