
Scikit - Linear Regression - Extracting Metrics

  • Python 3
  • scikit-learn-0.23.1
  • numpy-1.18.4
  • scipy-1.4.1

Is it possible to extract the number of iterations it took to fit the regression model (by gradient descent)? Aside from the general model evaluation metrics, I don't see anything related to the iteration count.

from sklearn.linear_model import LinearRegression
from sklearn import metrics
import numpy as np

regressor = LinearRegression()
regressor.fit(xtr, ytr)

# y_pred below is assumed to come from regressor.predict(...) on the held-out features
print('Mean Absolute Error:', metrics.mean_absolute_error(yt, y_pred))
print('Mean Squared Error:', metrics.mean_squared_error(yt, y_pred))
print('Root Mean Squared Error:', np.sqrt(metrics.mean_squared_error(yt, y_pred)))

Scikit-learn documentation showing all available metric modules
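For reference, here is a quick sketch (on made-up data) of inspecting the fitted estimator's attributes; nothing iteration-related shows up:

import numpy as np
from sklearn.linear_model import LinearRegression

# Made-up data purely to inspect what a fitted LinearRegression exposes
rng = np.random.RandomState(0)
X = rng.rand(100, 3)
y = X @ np.array([1.5, -2.0, 0.5]) + 0.1 * rng.randn(100)

model = LinearRegression().fit(X, y)

# Fitted attributes end with a single trailing underscore;
# none of them reports an iteration count
print([a for a in dir(model) if a.endswith('_') and not a.startswith('_')])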

As pointed out by Gilad, no iterations are involved in solving the linear regression problem with OLS (LinearRegression computes a closed-form least-squares solution). I am guessing that you want to fit the linear regression model using gradient descent. For the latter, the (maximum) number of iterations is defined before execution.

So, to use gradient descent, you can either write the algorithm manually (another link; see the sketch below) or use Ridge regression or Lasso regression from scikit-learn. These estimators expose an n_iter_ attribute that gives the number of iterations it took to determine the coefficients.
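For the manual route, here is a minimal batch gradient-descent sketch for least-squares linear regression that tracks its own iteration count (the function name and hyperparameters are illustrative, not part of scikit-learn):

import numpy as np

def gradient_descent_lr(X, y, lr=0.01, tol=1e-6, max_iter=10000):
    # Plain batch gradient descent on the mean squared error.
    # Returns the weights, the intercept and the number of iterations run.
    n_samples, n_features = X.shape
    w = np.zeros(n_features)
    b = 0.0
    for n_iter in range(1, max_iter + 1):
        residual = X @ w + b - y
        grad_w = (2.0 / n_samples) * (X.T @ residual)
        grad_b = (2.0 / n_samples) * residual.sum()
        w -= lr * grad_w
        b -= lr * grad_b
        # Stop once the gradient is (nearly) zero
        if np.linalg.norm(grad_w) < tol and abs(grad_b) < tol:
            break
    return w, b, n_iter

Calling something like w, b, n_iter = gradient_descent_lr(xtr, ytr) would give you the iteration count directly.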

class sklearn.linear_model.Ridge(alpha=1.0, *, fit_intercept=True, normalize=False, copy_X=True, max_iter=None, tol=0.001, solver='auto', random_state=None)

Attributes

n_iter_ : None or ndarray of shape (n_targets,). Actual number of iterations for each target. Available only for sag and lsqr solvers. Other solvers will return None.

Note: you can set the solver to 'sag' or 'saga' to use a gradient-descent-based method.
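As a rough sketch (on made-up data), fitting Ridge with the sag solver and reading n_iter_ afterwards could look like this:

import numpy as np
from sklearn.linear_model import Ridge

# Made-up regression data for illustration
rng = np.random.RandomState(0)
X = rng.rand(200, 5)
y = X @ rng.rand(5) + 0.1 * rng.randn(200)

# 'sag' (stochastic average gradient) is an iterative, gradient-based solver
ridge = Ridge(alpha=1.0, solver='sag', max_iter=10000, tol=1e-4, random_state=0)
ridge.fit(X, y)

print('Coefficients:', ridge.coef_)
print('Iterations used:', ridge.n_iter_)  # ndarray for 'sag'/'lsqr'; None for other solvers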
