
Calculating Mean average precision of a model that does not predict confidence value

How can I calculate the mean average precision when my model does not provide confidence values and I only have the confusion matrix generated from its output? Is it possible?


It would help if you posted which library you are using.
But generally speaking, since you have the confusion matrix, you also have the true positive and false positive counts, which you can use to calculate the precision:
Precision = true positives / (true positives + false positives)
or in short TP/(TP+FP)
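As a minimal sketch of the calculation above, assuming a 2x2 confusion matrix laid out as [[TN, FP], [FN, TP]] (the convention scikit-learn uses; your library's layout may differ), precision can be read off directly. The function name and the example numbers are hypothetical:

```python
def precision_from_confusion_matrix(cm):
    """Compute precision = TP / (TP + FP) from a 2x2 confusion matrix.

    Assumes the layout [[TN, FP], [FN, TP]] -- check your library's
    convention before using this on real output.
    """
    tn, fp = cm[0]
    fn, tp = cm[1]
    return tp / (tp + fp)

# Hypothetical example matrix: TN=50, FP=10, FN=5, TP=35
cm = [[50, 10],
      [5, 35]]
print(precision_from_confusion_matrix(cm))  # 35 / (35 + 10) = 0.777...
```

Note that a single confusion matrix only yields precision at one fixed operating point; without per-prediction confidence scores there is no way to rank predictions and sweep thresholds, which is what the "average" in average precision normally refers to.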
