Evaluation metric in recommender systems?

Hi All,

I just went through the recommender systems article and have a question about the evaluation metrics ‘Recall’ and ‘Precision’. Recall is defined as TP/(TP+FN).
Example: if a user likes 5 items and the recommendation engine shows 3 of them, then the recall is 3/5 = 0.6.

Let’s say I am using the collaborative filtering approach and have recommended movies. How do I know whether the user actually liked what I predicted?

Hi @rishabh835,

The example and calculation you stated are correct, but note that 3/5 = 0.6 is the recall, not the precision: out of the 5 items the user likes, 3 were recommended. Precision is TP/(TP+FP), i.e., the fraction of the recommended items that the user actually likes. In offline evaluation you typically check this by holding out part of each user's known likes or ratings as a test set and seeing whether the recommended items appear there. Both metrics are computed per user, so to evaluate the whole dataset you average them over all users.
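A minimal sketch of the per-user computation described above (function and variable names are illustrative, not from any particular library):

```python
def precision_recall(recommended, liked):
    """Per-user precision and recall.

    recommended: list of item ids the engine showed to the user.
    liked: set of item ids the user actually likes (e.g. from a held-out test set).
    """
    hits = sum(1 for item in recommended if item in liked)
    precision = hits / len(recommended) if recommended else 0.0
    recall = hits / len(liked) if liked else 0.0
    return precision, recall


# The forum example: the user likes 5 items, and 3 of them are recommended.
p, r = precision_recall(["a", "b", "c"], {"a", "b", "c", "d", "e"})
print(r)  # recall = 3/5 = 0.6
print(p)  # precision = 3/3 = 1.0, since every recommended item was liked
```

To score the whole dataset, call this once per user and average the results, e.g. `sum(recalls) / len(recalls)`.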

Averaging each user's average precision over all users gives the MAP (mean average precision). Note that average precision is rank-aware, so in MAP the order of the recommended items does matter: relevant items ranked higher contribute more to the score.
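A sketch of average precision (AP) for one user's ranked list, assuming the standard definition (precision at each relevant hit, divided by the number of relevant items); the MAP is then the mean of AP over users:

```python
def average_precision(ranked, liked):
    """AP for one user: ranked is the ordered recommendation list,
    liked is the set of relevant items."""
    hits = 0
    score = 0.0
    for rank, item in enumerate(ranked, start=1):
        if item in liked:
            hits += 1
            score += hits / rank  # precision at this cut-off
    return score / len(liked) if liked else 0.0


liked = {"a", "b"}
print(average_precision(["a", "b", "x"], liked))  # 1.0: both hits at the top
print(average_precision(["x", "a", "b"], liked))  # ~0.583: same items, worse order
```

The two calls recommend the same three items, yet score differently, which is exactly why ordering matters for MAP.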

© Copyright 2013-2019 Analytics Vidhya