Linear Regression with positive coefficients in Python
I'm trying to find a way to fit a linear regression model with positive coefficients.

The only way I found is sklearn's Lasso model, which has a positive=True argument, but using it with alpha=0 (meaning no other constraints on the weights) isn't recommended.

Do you know of another model/method/way to do it?
IIUC, this is a problem which can be solved by scipy.optimize.nnls, which can do non-negative least squares:

Solve argmin_x || Ax - b ||_2 for x >= 0.

In your case, b is the y, A is the X, and x is the β (coefficients), but otherwise it's the same, no?
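A minimal sketch of the mapping above (the data here is made up for illustration; note that nnls has no intercept term, so you'd need to append a column of ones to X if you want one):

```python
import numpy as np
from scipy.optimize import nnls

# Synthetic data with known positive coefficients [2, 3]
rng = np.random.default_rng(0)
X = rng.random((50, 2))
y = X @ np.array([2.0, 3.0]) + 0.01 * rng.standard_normal(50)

# nnls solves argmin_x ||Ax - b||_2 subject to x >= 0,
# so pass A=X and b=y to get the constrained coefficients
coef, residual_norm = nnls(X, y)
print(coef)  # non-negative, close to [2, 3]
```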
Several functions can fit a linear regression model with positive coefficients. As of version 0.24, scikit-learn's LinearRegression includes a similar argument, positive, which does exactly that; from the docs:
positive : bool, default=False

When set to True, forces the coefficients to be positive. This option is only supported for dense arrays.

New in version 0.24.
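A short sketch of how this could be used (synthetic data for illustration; note the constraint applies to the coefficients, not the intercept):

```python
import numpy as np
from sklearn.linear_model import LinearRegression

# Synthetic data with known positive coefficients [2, 3] and intercept 1
rng = np.random.default_rng(0)
X = rng.random((50, 2))
y = X @ np.array([2.0, 3.0]) + 1.0 + 0.01 * rng.standard_normal(50)

# positive=True constrains every fitted coefficient to be >= 0
model = LinearRegression(positive=True).fit(X, y)
print(model.coef_)  # non-negative, close to [2, 3]
```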