
Why doesn't this simple linear regression with gradient descent work?

I'm new to machine learning and I'm trying to do a linear regression for f(x) = kx by gradient descent. The derivative is:

d/dk (f(x) - y)^2

= 2(f(x) - y) * d/dk (kx - y)

= 2x(f(x) - y)

= 2x(kx - y)
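
As a sanity check (the values of k, x, and y below are arbitrary, not from the data above), a finite-difference approximation agrees with this derivative:

# Numerically verify d/dk (kx - y)^2 = 2x(kx - y)
# (illustrative values only)
k, x, y = 3.0, 5.0, 7.0
eps = 1e-6

loss = lambda kk: (kk * x - y) ** 2
numeric = (loss(k + eps) - loss(k - eps)) / (2 * eps)  # central difference
analytic = 2 * x * (k * x - y)

print(numeric, analytic)  # both print roughly 80.0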

So, by gradient descent, update k with k = k - rate * 2x(kx - y).

This is exactly what the textbook says, so I thought it would work :-(

from random import uniform

# k0 is the "true" slope (y = k0 * x); k is the estimate being learned
k, k0 = uniform(-100, 100), uniform(-100, 100)
for _ in range(10):
    x = uniform(-100, 100)
    # gradient step: rate * x * (k*x - y) with y = k0 * x
    k = k - 0.01 * x * (k * x - k0 * x)
    print(k, k0)

Sadly, the output:

-2639.75970458 -72.294275335
56444.9277867 -72.294275335
-350533.559366 -72.294275335
-315222.824967 -72.294275335
26481249.7869 -72.294275335
25795070.4808 -72.294275335
-329558179.012 -72.294275335
22212688252.9 -72.294275335
-2.2317104093e+11 -72.294275335
1.61788553661e+12 -72.294275335

k diverges from k0 at an alarming speed :-(

I've already read the wiki, searched Google, and looked at the questions recommended on the right of this page, but I still have no idea :-( Thanks a lot.

Make your "learning rate" (e.g. 0.01) smaller and the number of iterations, N, larger. With x drawn from (-100, 100), a step of 0.01 * x**2 can be as large as 100, so each update overshoots the minimum and k blows up; a much smaller rate keeps the updates stable:

from random import uniform

learning_rate = 0.0001  # small enough that learning_rate * x**2 stays below 1
N = 100                 # more iterations to make up for the smaller steps
k, k0 = uniform(-100, 100), uniform(-100, 100)
for _ in range(N):
    x = uniform(-100, 100)
    k = k - learning_rate * x * (k * x - k0 * x)
    print(k, k0)
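
Why this works: since y = k0*x in this setup, each update multiplies the error k - k0 by (1 - learning_rate * x**2). For x in (-100, 100), x**2 stays below 10,000, so with learning_rate = 0.0001 that factor stays in (0, 1] and the error can only shrink, whereas 0.01 gives factors as large as -99 and the error grows. A minimal sketch (same setup as above, just printing the error instead of k) to watch the convergence:

from random import uniform

# Track the error |k - k0| instead of k itself to see it shrink
learning_rate = 0.0001
N = 100
k, k0 = uniform(-100, 100), uniform(-100, 100)
for i in range(N):
    x = uniform(-100, 100)
    k = k - learning_rate * x * (k * x - k0 * x)
    if i % 10 == 0:
        print(i, abs(k - k0))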
