How to calculate confusion matrix - error in calculation?

I was going through this very well-written piece by Tavish on model performance metrics:

I had a question about the confusion matrix calculation that Tavish has given. Comparing the reference calculation Tavish shows with the worked example (from the Kaggle competition), there seems to be an error, especially in the specificity and the Negative Predictive Value calculations.

Specificity, according to the formula, is d/(b+d). If we apply this formula to the example below, I think the value that would come out is 98.3%, as against the 1.7% that is shown.

There is a similar discrepancy in the Negative Predictive Value calculation.
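To make the two formulas concrete, here is a minimal Python sketch of how specificity and Negative Predictive Value are computed from a 2x2 confusion matrix, assuming the usual a/b/c/d layout (a = true positives, b = false positives, c = false negatives, d = true negatives). The numbers are made up for illustration only; they are not the values from Tavish's Kaggle example.

```python
# Hypothetical 2x2 confusion matrix (not the article's data):
#                Actual +   Actual -
# Predicted +       a          b
# Predicted -       c          d
a, b, c, d = 50, 10, 5, 90

# Specificity: true negatives out of all actual negatives
specificity = d / (b + d)

# Negative Predictive Value: true negatives out of all predicted negatives
npv = d / (c + d)

print(f"Specificity: {specificity:.1%}")  # 90/100 -> 90.0%
print(f"NPV:         {npv:.1%}")          # 90/95  -> 94.7%
```

Note that with any realistic matrix where true negatives dominate, d/(b+d) will be close to 100%, not close to 0%, which is why a reported specificity of 1.7% looks like the numerator and denominator were swapped or mislabeled.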

I may be wrong on this one, but I just wanted to get some clarity on it.

@Amol, it would be helpful to others if you posted a link to the article or provided the confusion matrix in the post.


Hi Anon,

Please refer to the link below: