How can I get precision & recall instead of accuracy in TensorFlow

I'm looking at a spam-prediction example, written by someone else, that classifies messages as spam and ham.

[Source code] https://github.com/nfmcclure/tensorflow_cookbook/blob/master/09_Recurrent_Neural_Networks/02_Implementing_RNN_for_Spam_Prediction/02_implementing_rnn.py

The program produces the following values: (loss, accuracy).

View result screenshot

In this code, the only results are loss and accuracy.

I think accuracy alone is not meaningful here. I need precision and recall values (for the F1 measure).

Even though my reading of the code is not going well, I do know what precision and recall are. What I don't know is how to calculate them (where to embed the code) in this script.
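
For reference, precision, recall and F1 follow directly from the confusion-matrix counts. The small helper below is just a plain-Python sketch of those formulas (the function name and the example numbers are illustrative, not part of the script):

def precision_recall_f1(tp, fp, fn):
    """Compute precision, recall and F1 from raw confusion-matrix counts."""
    precision = float(tp) / (tp + fp) if (tp + fp) else 0.0  # of predicted positives, how many are truly positive
    recall = float(tp) / (tp + fn) if (tp + fn) else 0.0     # of actual positives, how many were caught
    f1 = 2 * precision * recall / (precision + recall) if (precision + recall) else 0.0
    return precision, recall, f1

# Example with made-up counts: 40 true positives, 10 false positives, 5 false negatives
print(precision_recall_f1(40, 10, 5))  # -> (0.8, 0.888..., 0.842...)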

I managed it myself, hurray!!

Here is the code:

# Labels cast to int64, and hard class predictions taken as the argmax of the logits
actuals = tf.cast(y_output, tf.int64)
predictions = tf.argmax(logits_out, 1)

# All-ones and all-zeros tensors of matching shape, used for element-wise comparison
ones_like_actuals = tf.ones_like(actuals)
zeros_like_actuals = tf.zeros_like(actuals)
ones_like_predictions = tf.ones_like(predictions)
zeros_like_predictions = tf.zeros_like(predictions)

# True positives: actual class 1, predicted class 1
tp_op = tf.reduce_sum(
    tf.cast(
      tf.logical_and(
        tf.equal(actuals, ones_like_actuals), 
        tf.equal(predictions, ones_like_predictions)
      ), 
      "float"
    )
)

# True negatives: actual class 0, predicted class 0
tn_op = tf.reduce_sum(
    tf.cast(
      tf.logical_and(
        tf.equal(actuals, zeros_like_actuals), 
        tf.equal(predictions, zeros_like_predictions)
      ), 
      "float"
    )
)

# False positives: actual class 0, predicted class 1
fp_op = tf.reduce_sum(
    tf.cast(
      tf.logical_and(
        tf.equal(actuals, zeros_like_actuals), 
        tf.equal(predictions, ones_like_predictions)
      ), 
      "float"
    )
)

# False negatives: actual class 1, predicted class 0
fn_op = tf.reduce_sum(
    tf.cast(
      tf.logical_and(
        tf.equal(actuals, ones_like_actuals), 
        tf.equal(predictions, zeros_like_predictions)
      ), 
      "float"
    )
)
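
Building on the count ops above, one way to turn them into precision, recall and F1 ops is a direct division. This part is my own sketch rather than the cookbook code; the small epsilon is only there to avoid division by zero:

epsilon = 1e-7  # guard against division by zero when a count is 0
precision_op = tp_op / (tp_op + fp_op + epsilon)
recall_op = tp_op / (tp_op + fn_op + epsilon)
f1_op = 2.0 * precision_op * recall_op / (precision_op + recall_op + epsilon)

These ops can then be evaluated in the same session run that already fetches the loss and accuracy.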

I found this open-source confusion-matrix code on GitHub; thank you @Mistobaan!! https://gist.github.com/Mistobaan/337222ac3acbfc00bdac
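
As an alternative sketch (assuming the same TF 1.x graph as in the cookbook script), tf.confusion_matrix builds the whole 2x2 matrix in a single op:

# Rows are actual classes, columns are predicted classes (TF 1.x API)
confusion_op = tf.confusion_matrix(labels=actuals, predictions=predictions, num_classes=2)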

