
Linear regression using Gradient Descent

I'm facing some issues trying to find the linear regression line using Gradient Descent, and I'm getting weird results. Here is the function:

def gradient_descent(m_k, c_k, learning_rate, points):
    n = len(points)
    dm, dc = 0, 0 
    for i in range(n):
        x = points.iloc[i]['alcohol']
        y = points.iloc[i]['total']
        dm += -(2/n) * x * (y - (m_k * x + c_k))  # Partial der in m
        dc += -(2/n) * (y - (m_k * x + c_k))  # Partial der in c
    m = m_k - dm * learning_rate
    c = c_k - dc * learning_rate
    return m, c 

And combined with a for loop:

l_rate = 0.0001
m, c = 0, 0
epochs = 1000

for _ in range(epochs):
    m, c = gradient_descent(m, c, l_rate, dataset)

plt.scatter(dataset.alcohol, dataset.total)
plt.plot(list(range(2, 10)), [m * x + c for x in range(2,10)], color='red')
plt.show()

Gives this result:

  • Slope: 2.8061974241244196
  • Y intercept: 0.5712221080810446

[plot: scatter of alcohol vs. total with the gradient-descent fit line]

The problem is that when I use sklearn to compute the slope and intercept, i.e.

from sklearn.linear_model import LinearRegression
import numpy as np

model = LinearRegression(fit_intercept=True).fit(np.array(dataset['alcohol']).copy().reshape(-1, 1),
                                                 np.array(dataset['total']).copy())

I get something completely different:

  • Slope: 2.0325063
  • Intercept: 5.8577761548263005

[plot: scatter of alcohol vs. total with the sklearn fit line]

Any idea why? Looking on SO I've found that a possible problem could be too high a learning rate, but as stated above I'm currently using 0.0001.

Sklearn's LinearRegression doesn't use gradient descent - it uses Ordinary Least Squares (OLS) regression, which is a non-iterative method.
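For reference, the OLS fit can be computed directly from the normal equations; here's a minimal sketch (assuming the same dataset DataFrame with 'alcohol' and 'total' columns as in the question) that should reproduce sklearn's slope and intercept:

import numpy as np

# Closed-form OLS for y = m*x + c (assumes `dataset` from the question)
x = dataset['alcohol'].to_numpy()
y = dataset['total'].to_numpy()

X = np.column_stack([x, np.ones_like(x)])        # design matrix [x, 1]
(m_ols, c_ols), *_ = np.linalg.lstsq(X, y, rcond=None)

print(m_ols, c_ols)  # should match sklearn's slope and intercept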

For your model, you might consider randomly initialising m and c rather than starting with 0, 0. You could also consider adjusting the learning rate or using an adaptive learning rate.
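Since the squared-error loss is convex, gradient descent should approach the same OLS solution given enough iterations and a learning rate that keeps the updates stable; with only 1000 epochs at 0.0001 it has probably just not converged yet. A minimal vectorised sketch (again assuming the same dataset; the epoch count and learning rate are illustrative and may need tuning for your data):

import numpy as np

# Vectorised gradient descent on the same data (assumes `dataset`
# from the question), with random initialisation instead of 0, 0.
x = dataset['alcohol'].to_numpy()
y = dataset['total'].to_numpy()
n = len(x)

m, c = np.random.randn(), np.random.randn()
l_rate = 0.0001        # illustrative; tune for your data
epochs = 500_000       # far more than 1000; GD needs time to converge

for _ in range(epochs):
    err = y - (m * x + c)
    m -= l_rate * (-(2 / n) * np.sum(x * err))   # step along dL/dm
    c -= l_rate * (-(2 / n) * np.sum(err))       # step along dL/dc

print(m, c)  # should end up much closer to the OLS/sklearn values

A simple adaptive alternative is to shrink the learning rate whenever the loss stops decreasing, but even a fixed rate typically gets there once given enough epochs.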
