Difference between statsmodels OLS and scikit-learn linear regression; different models give different R squared
I am new to Python and trying to fit a simple linear regression. My model has one dependent variable and one independent variable. Using linear_model.LinearRegression() from the sklearn package, I got an R squared value of 0.16. Then I used import statsmodels.api as sm and mod = sm.OLS(Y_train, X_train), and got an R squared of 0.61. Below is the code, starting from getting the data from BigQuery.
****Code for linear regression****
import numpy as np
import pandas as pd
from sklearn import linear_model
from sklearn.metrics import r2_score

train_data_df = pd.read_gbq(query, project_id)
train_data_df.head()
X_train = train_data_df.revisit_next_day_rate.values[:, np.newaxis]
Y_train = train_data_df.demand_1yr_per_new_member.values[:, np.newaxis]

# scikit-learn version to get prediction R2
model_sci = linear_model.LinearRegression()
model_sci.fit(X_train, Y_train)
print(model_sci.intercept_)
print('Coefficients: \n', model_sci.coef_)
print('Residual sum of squares %.2f'
      % np.mean((model_sci.predict(X_train) - Y_train) ** 2))
print('Variance score: %.2f' % model_sci.score(X_train, Y_train))
Y_train_predict = model_sci.predict(X_train)
print('R Square', r2_score(Y_train, Y_train_predict))
****For OLS****
import statsmodels.api as sm

print(Y_train[:3])
print(X_train[:3])
mod = sm.OLS(Y_train, X_train)
res = mod.fit()
print(res.summary())
I am very new to this and am trying to understand which linear regression package I should use.
Found out the difference: it was the intercept. statsmodels OLS does not include one by default, while scikit-learn's LinearRegression fits one by default. After adding the code below, the answers matched.
X_train = sm.add_constant(X_train)
sm.OLS(Y_train, X_train)