I am trying to plot the effect of the learning rate in gradient descent and am getting this error: ValueError: x and y must have same first dimension, but have shapes (101,) and (100,). If I make the y-array 101 elements long instead, it plots a straight line.
import numpy as np
import matplotlib.pyplot as plt

def g(x):
    return x**4 - 4*x**2 + 5

def dg(x):
    return 4*x**3 - 8*x

def gradient_descent(derivative_func, initial_guess, multiplier=0.01,
                     precision=0.0001, max_iter=500):
    new_x = initial_guess
    x_list = []
    slope_list = []
    for n in range(max_iter):
        previous_x = new_x
        gradient = derivative_func(previous_x)  # the slope of the graph (error)
        new_x = previous_x - multiplier * gradient  # the learning-rate step
        step_size = abs(new_x - previous_x)
        x_list.append(new_x)
        slope_list.append(derivative_func(new_x))
        # print(step_size)
        if step_size < precision:
            break
    return new_x, x_list, slope_list
n = 100
local_min, list_x, deriv_list = gradient_descent(derivative_func=dg, initial_guess=0.1,
                                                 multiplier=0.0005, precision=0.0001, max_iter=n)

# Plotting reduction in cost for each iteration
plt.figure(figsize=[15, 5])
plt.xlim(0, n)
plt.ylim(0, 50)
plt.xlabel('Nr. of Iterations', fontsize=16)
plt.ylabel('Cost', fontsize=16)
plt.title('Effect of Learning Rate', fontsize=16, c='g')

# Getting x-axis values
iteration_list1 = list(range(0, n+1))
iteration_list = np.array(iteration_list1)

# Getting y-axis values
low_values = np.array(list_x)
plt.plot(iteration_list, g(low_values), linewidth=6, c='green')
plt.show()
First, both arrays must have the same number of elements: `list_x` gains one value per iteration, so it holds at most `n` values, while `range(0, n+1)` produces `n + 1`. So change this line:
iteration_list1= list(range(0,n+1))
to
iteration_list1= list(range(0,n))
Then, I think you should delete the line
plt.ylim(0,50)
to let the library choose the best y-scale. You see a "straight line" because the variations in cost are very small compared to the fixed 0-50 range.
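Note that the loop can also `break` before `max_iter` iterations once `step_size` drops below `precision`, in which case `list_x` is shorter than `n`. A more robust fix is to size the x-axis from the returned list itself. A minimal sketch using the question's own functions (with `slope_list` dropped for brevity):

```python
import numpy as np
import matplotlib.pyplot as plt

def g(x):
    return x**4 - 4*x**2 + 5

def dg(x):
    return 4*x**3 - 8*x

def gradient_descent(derivative_func, initial_guess, multiplier=0.01,
                     precision=0.0001, max_iter=500):
    new_x = initial_guess
    x_list = []
    for _ in range(max_iter):
        previous_x = new_x
        new_x = previous_x - multiplier * derivative_func(previous_x)
        x_list.append(new_x)
        if abs(new_x - previous_x) < precision:
            break  # may stop before max_iter iterations
    return new_x, x_list

local_min, list_x = gradient_descent(dg, initial_guess=0.1,
                                     multiplier=0.0005, max_iter=100)

# Building the x-axis from the data guarantees matching shapes,
# whether or not the loop stopped early.
iteration_list = np.arange(len(list_x))
low_values = np.array(list_x)
plt.plot(iteration_list, g(low_values), linewidth=6, c='green')
plt.xlabel('Nr. of Iterations', fontsize=16)
plt.ylabel('Cost', fontsize=16)
plt.show()
```

This way you never have to keep `n` and the length of the result in sync by hand.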