
Sklearn Precision and recall giving wrong values

[image: confusion matrix and precision score output]

Why is my precision score so low in the above image?

I see in your comments that you're trying to interpret confusion_matrix as [[tp, fp], [fn, tn]].

Based on the documentation, sklearn.metrics.confusion_matrix is a function that returns an array laid out as:

[[tn, fp], [fn, tp]]

So the layout is the reverse of what you assumed, and the computed precision is actually correct:

397 / (397 + 925) = 0.30030257...

