
Finding the accuracy from the confusion matrix in pd.crosstab

Using pd.crosstab, I can produce a confusion matrix from my predicted data. I used the following line to generate the confusion matrix:

pd.crosstab(test_data['class'], test_data['predicted'], margins = True)

Similarly, in R I can generate a confusion matrix using the line below:

confusion_matrix <- table(truth = data.test$class, prediction = predict(model, data.test[,-46], type = 'class'))

And in R I can find the accuracy of my model using this line:

sum(diag(confusion_matrix)) / sum(confusion_matrix)

In Python, is there an equivalent of sum(diag(confusion_matrix)) / sum(confusion_matrix) to calculate the accuracy from my confusion matrix?

I would prefer not to use any libraries other than pandas (e.g. no scikit-learn).

You need to use numpy: apply np.diag to the crosstab result and sum it to get the diagonal total, then convert the crosstab result to a numpy array and sum it to get the grand total:

import numpy as np
import pandas as pd

# Create a small reproducible example
np.random.seed(123)
test_data = pd.DataFrame({'class': np.random.randint(0, 2, 10),
                          'predicted': np.random.randint(0, 2, 10)})

tab = pd.crosstab(test_data['class'], test_data['predicted'])

predicted  0  1
class
0          4  3
1          0  3

np.diag(tab).sum() / tab.to_numpy().sum()
0.7
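
As a side note, since accuracy is just the fraction of rows where the prediction matches the true class, you can also compute it directly from the two columns without building the confusion matrix at all. This is a minimal pandas-only sketch reusing the same test_data frame as above:

# Fraction of rows where the predicted label equals the true label
# (equivalent to diagonal sum / total count of the confusion matrix)
(test_data['class'] == test_data['predicted']).mean()
0.7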

Or hardcode it (though I'm not sure why you would want to):

(tab.iloc[0,0] + tab.iloc[1,1]) / tab.to_numpy().sum()
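
One caveat: the crosstab in the question was built with margins = True, which appends an 'All' row and column of totals. Those margins would distort both the diagonal sum and the grand total, so drop them (or rebuild the table without margins) before computing accuracy. A minimal sketch, assuming a table named tab_margins built with margins = True as in the question:

tab_margins = pd.crosstab(test_data['class'], test_data['predicted'], margins = True)
# Remove the 'All' totals row and column before computing accuracy
tab_core = tab_margins.drop(index='All', columns='All')
np.diag(tab_core).sum() / tab_core.to_numpy().sum()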
