
Specificity at different thresholds (in the same way as sklearn.metrics.precision_recall_curve)

I would like to get the specificities at different thresholds, in the same way that precisions and recalls are given by precision_recall_curve:

precisions, recalls, thresholds = sklearn.metrics.precision_recall_curve(ground_truth, predictions)

How can I achieve that?

So, I looked at the source code for sklearn.metrics.precision_recall_curve ( https://github.com/scikit-learn/scikit-learn/blob/2e90b897768fd360ef855cb46e0b37f2b6faaf72/sklearn/metrics/_ranking.py ) and adapted it to my needs.

import numpy as np
# Note: _binary_clf_curve is a private helper; in recent scikit-learn releases
# it lives in sklearn.metrics._ranking (the old sklearn.metrics.ranking path was removed).
from sklearn.metrics._ranking import _binary_clf_curve

def specificity_sensitivity_curve(y_true, probas_pred):
    """
    Compute specificity-sensitivity pairs for different probability thresholds.
    For reference, see 'precision_recall_curve'.
    """
    # Cumulative false positives and true positives at each distinct threshold.
    fps, tps, thresholds = _binary_clf_curve(y_true, probas_pred)
    # Sensitivity (recall, TPR): tps[-1] is the total number of positives.
    sensitivity = tps / tps[-1]
    # Specificity (TNR): fps[-1] is the total number of negatives.
    specificity = (fps[-1] - fps) / fps[-1]
    # Stop once full sensitivity is reached, and reverse so the curve starts
    # at specificity 1, mirroring what precision_recall_curve does.
    last_ind = tps.searchsorted(tps[-1])
    sl = slice(last_ind, None, -1)
    return np.r_[specificity[sl], 1], np.r_[sensitivity[sl], 0], thresholds[sl]
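If you would rather avoid the private _binary_clf_curve helper, the same quantities can be derived from the public sklearn.metrics.roc_curve, since specificity is simply 1 - FPR and sensitivity is the TPR it returns. A minimal sketch (the toy labels and scores are made up for illustration):

import numpy as np
from sklearn.metrics import roc_curve

y_true = np.array([0, 0, 1, 1])
scores = np.array([0.1, 0.4, 0.35, 0.8])

# roc_curve returns false positive rate, true positive rate, and thresholds.
fpr, tpr, thresholds = roc_curve(y_true, scores)

specificity = 1 - fpr   # TNR = 1 - FPR
sensitivity = tpr       # sensitivity is just the TPR / recall

Note that roc_curve orders points from the highest threshold down, so the curve starts at specificity 1, sensitivity 0; set drop_intermediate=False if you need every distinct threshold kept.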

