简体   繁体   中英

gradient descent with armijo linesearch

I am currently implementing an gradient descent algorithm which includes a dynamic linesearch method using goldstein-armijo - backtracking method. It works to a certain extend, but then fails to converge, to split the stepsize and converge a minimum. Since I am relatively new to the topic, I do not really know how to tweak, or change it to make it converge more.

# starting position
startx = np.array([x,y])

x_arr = [startx]
x = x_arr[-1]
beta = 0.1
alpha = 0.00001

# search direction,
# f_gradient returns the gradient
p = -f_gradient(x_arr[-1])

# cutoff
cutoff_thressh = 1*(10**(-8))

#initial stepsize
stp = 1

derphi = np.dot(f_gradient(x),p)

while curr_it < lim_it:

    if np.linalg.norm(f_gradient(x)) < cutoff_thressh:
        break

    # armijo conditions to reduce stepsize
    while f(x + stp * p) > (f(x) + alpha * stp * derphi)):
        stp*=beta

    gradient_mult = stp*f_gradient(x)
    x_new = np.subtract(x, gradient_mult)
    x_arr.append(x_new.tolist())

any advice is helpful! Thank you

The technical post webpages of this site follow the CC BY-SA 4.0 protocol. If you need to reprint, please indicate the site URL or the original address.Any question please contact:yoyou2525@163.com.

 
粤ICP备18138465号  © 2020-2024 STACKOOM.COM