
statsmodels metric for comparing logistic regression models?

I'm learning about logistic regression by building models in statsmodels.

I know that if I build a linear regression model in statsmodels, lin_mod = sm.OLS(y_var, X_vars).fit(), I can easily get the adjusted R-squared with lin_mod.rsquared_adj. I find adjusted R-squared pretty helpful when comparing my linear regression models.

Now I'm building logistic regression models, log_mod = sm.Logit(y_var, X_vars).fit(). I know there is a pseudo-R-squared metric, log_mod.prsquared, but I don't find it very convincing. Is there some other easily accessible metric in statsmodels that might be helpful for comparing logistic regression models?
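For reference, a minimal sketch of my setup (the DataFrame df and the column names x1, x2, outcome are just placeholders for my own data):

import statsmodels.api as sm

# Placeholder predictors and binary outcome from a pandas DataFrame df
X_vars = sm.add_constant(df[["x1", "x2"]])
y_var = df["outcome"]

# Linear model: adjusted R-squared is directly available
lin_mod = sm.OLS(y_var, X_vars).fit()
print(lin_mod.rsquared_adj)

# Logistic model: McFadden's pseudo R-squared is the analogous attribute
log_mod = sm.Logit(y_var, X_vars).fit()
print(log_mod.prsquared)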

In statsmodels you could go with

print(log_mod.summary())

to gather more information about your model. Otherwise, if you can use sklearn.metrics, try confusion_matrix and/or accuracy_score, as sketched below.
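A rough sketch, assuming you already have the fitted log_mod together with the X_vars and y_var used to fit it (the 0.5 probability cutoff is just one common choice, not the only one):

from sklearn.metrics import confusion_matrix, accuracy_score

# The summary table shows the log-likelihood, pseudo R-squared, AIC/BIC, etc.
print(log_mod.summary())

# Convert the fitted probabilities into class labels with a 0.5 threshold
y_pred = (log_mod.predict(X_vars) >= 0.5).astype(int)

print(confusion_matrix(y_var, y_pred))
print(accuracy_score(y_var, y_pred))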

Take a closer look at this post by Prof. Frank Harrell, Statistically Efficient Ways to Quantify Added Predictive Value of New Measurements, where he describes it in detail.

TL;DR: use a likelihood ratio test in Python.
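A minimal sketch of a likelihood ratio test between two nested Logit models in statsmodels (the predictor names x1, x2 and the DataFrame df are placeholders; both models must be fit on the same rows):

import statsmodels.api as sm
from scipy import stats

# Reduced model (fewer predictors) vs. full model (extra predictors)
X_reduced = sm.add_constant(df[["x1"]])
X_full = sm.add_constant(df[["x1", "x2"]])

reduced = sm.Logit(y_var, X_reduced).fit(disp=0)
full = sm.Logit(y_var, X_full).fit(disp=0)

# LR statistic: twice the difference in log-likelihoods,
# chi-squared distributed with df equal to the number of extra parameters
lr_stat = 2 * (full.llf - reduced.llf)
df_diff = full.df_model - reduced.df_model
p_value = stats.chi2.sf(lr_stat, df_diff)

print(f"LR statistic = {lr_stat:.3f}, df = {df_diff:.0f}, p-value = {p_value:.4f}")

A small p-value indicates that the extra predictors in the full model add significant explanatory power over the reduced model.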

