
Linear Regression without Least Squares in sklearn

I am working with the LinearRegression class from sklearn.linear_model and I want to compute the parameters of my linear regression model without using least squares.

For example, I would like to estimate these parameters by minimizing one of the regression metrics defined in the sklearn.metrics module (for instance, mean_squared_log_error).

Is there a module that will allow me to easily do this?

You can write your own cost function and call scipy.optimize.minimize. Be aware that minimize applies no constraints by default, so you may want to add some on top of what I'm showing here (a bounded example is sketched at the end):

import numpy as np
from sklearn.metrics import mean_squared_log_error
from scipy.optimize import minimize

# Toy data: inputs a and targets b
a = np.array([1, 2, 3])
b = np.array([100, 200, 300])

So here is the model I want to learn (i.e. the regressor):

def fun(x):
    return a * x  # each input in a scaled by the single parameter x

Now here is my cost function:

def cost(x):
    return mean_squared_log_error(b, fun(x))

And now I can optimize it:

print(minimize(cost, x0=[1]))
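
minimize returns a scipy.optimize.OptimizeResult, and the fitted parameter is in its x attribute. As a quick check (a sketch; the exact numbers depend on the optimizer), the result should land near x = 100 here, since fun(100) reproduces b exactly:

res = minimize(cost, x0=[1])
print(res.x)       # fitted coefficient, close to 100 for this data
print(fun(res.x))  # predictions, close to [100, 200, 300]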

Be aware that I don't provide a gradient here, so optimization can be slow (some optimizers fall back to numerical differentiation, IIRC).
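
To address both caveats at once, here is a minimal sketch (assuming you pick L-BFGS-B as the bounded method): bounds=[(0, None)] keeps x non-negative, which also keeps the predictions valid for mean_squared_log_error, and passing an analytic gradient via jac avoids numerical differentiation. The gradient below is just the hand derivative of mean((log1p(a*x) - log1p(b))**2), reusing a, b and cost from above:

def cost_grad(x):
    # d/dx of mean((log1p(a*x) - log1p(b))**2)
    pred = a * x
    return np.array([np.mean(2.0 * (np.log1p(pred) - np.log1p(b)) * a / (1.0 + pred))])

res = minimize(cost, x0=[1], jac=cost_grad,
               method="L-BFGS-B", bounds=[(0, None)])
print(res.x)  # still close to 100, typically with fewer cost evaluations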
