
How are precision and recall calculated in the classification report?

Confusion Matrix:

[[4 2]
 [1 3]]

Accuracy Score: 0.7

Report:

              precision    recall  f1-score   support

          0       0.80      0.67      0.73         6
          1       0.60      0.75      0.67         4

avg / total       0.72      0.70      0.70        10

From the formula precision = true positive / (true positive + false positive):

4 / (4 + 2) = 0.667

But this value appears under recall.

The formula for recall is true positive / (true positive + false negative):

4 / (4 + 1) = 0.80

I don't seem to get the difference.

Hard to say for sure without seeing code, but my guess is that you are using sklearn and did not pass labels into your confusion matrix. Without explicit labels, sklearn decides the ordering itself, so when you read the matrix with the wrong orientation the false positives and false negatives end up swapped.
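For reference, here is a minimal sketch (the y_true / y_pred values below are invented to reproduce the matrix in the question; they are not the asker's actual data) showing how sklearn orients the confusion matrix and how the classification report numbers follow from it:

from sklearn.metrics import confusion_matrix, classification_report

# Synthetic labels that reproduce the matrix [[4 2], [1 3]]:
# 6 samples are really class 0, 4 are really class 1.
y_true = [0] * 6 + [1] * 4
y_pred = [0] * 4 + [1] * 2 + [0] * 1 + [1] * 3

# In sklearn, rows are the TRUE labels and columns are the PREDICTED labels,
# so entry [0][1] = 2 means "2 samples that are really 0 were predicted as 1".
print(confusion_matrix(y_true, y_pred, labels=[0, 1]))
# [[4 2]
#  [1 3]]

print(classification_report(y_true, y_pred, labels=[0, 1]))

# For class 0:
#   precision = TP / (TP + FP) = 4 / (4 + 1) = 0.80   (read down column 0)
#   recall    = TP / (TP + FN) = 4 / (4 + 2) = 0.67   (read across row 0)
# Reading the matrix with rows as predictions instead of true labels swaps
# FP and FN, which makes the two figures look exchanged.

The report in the question is consistent with this orientation, so the formulas themselves are fine; it is only the row/column reading of the matrix that makes precision and recall appear swapped.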
