
Linear Regression and gradient descent

In Linear Regression, we have formulas to calculate the slope and intercept and so find the best-fit line; why, then, do we need to use Gradient Descent to compute the optimum slope and intercept, which we already get from those formulas?
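For concreteness, here is a minimal sketch of the closed-form formulas the question refers to, for simple linear regression y = slope * x + intercept (NumPy assumed; the function name closed_form_fit is illustrative, not from the post):

```python
import numpy as np

def closed_form_fit(x, y):
    """Ordinary least squares via the textbook formulas."""
    x_mean, y_mean = x.mean(), y.mean()
    # slope = Cov(x, y) / Var(x)
    slope = np.sum((x - x_mean) * (y - y_mean)) / np.sum((x - x_mean) ** 2)
    intercept = y_mean - slope * x_mean
    return slope, intercept

rng = np.random.default_rng(0)
x = rng.uniform(0, 10, 100)
y = 3.0 * x + 2.0 + rng.normal(0, 1, 100)
print(closed_form_fit(x, y))  # roughly (3.0, 2.0) on this synthetic data
```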

I guess you are referring to the closed-form solution of linear regression.

And yes, you can certainly use it for that purpose. However, that only works as long as all of your data fits into memory, which you don't necessarily need when using gradient descent: since gradient descent is an iterative process, you can swap data in and out, making it more appropriate for big data.
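As an illustration of that point, here is a minimal sketch of mini-batch gradient descent for the same simple linear regression (again assuming NumPy; gd_fit, the learning rate, batch size, and epoch count are illustrative choices, not from the answer). Only the current batch needs to be resident in memory at any time:

```python
import numpy as np

def gd_fit(x, y, lr=0.01, epochs=500, batch_size=20):
    """Mini-batch gradient descent on mean squared error."""
    slope, intercept = 0.0, 0.0
    n = len(x)
    rng = np.random.default_rng(0)
    for _ in range(epochs):
        idx = rng.permutation(n)
        for start in range(0, n, batch_size):
            batch = idx[start:start + batch_size]
            xb, yb = x[batch], y[batch]  # only this batch must be in memory
            err = slope * xb + intercept - yb
            # gradients of mean((slope*x + intercept - y)^2)
            slope -= lr * 2 * np.mean(err * xb)
            intercept -= lr * 2 * np.mean(err)
    return slope, intercept
```

With data too large for memory, you would replace the in-memory indexing with reads from disk or a stream, which is exactly why the iterative approach scales where the closed-form solution does not.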

See the Cross Validated thread "Solving for regression parameters in closed-form vs gradient descent" for more details.
