
Issue with backpropagation in a simple deep learning network (Python)

I can't figure out the issue with this simple two-layer network. The forward pass seems to be error-free; however, I am unable to work out how to compute the gradient updates for w1, w2, and b1, the weights and bias of the first layer.

# forward

z1 = point[0]*w1 + point[1]*w2 +  b1
z2 = sigmoid(z1)*w3 + b2
pred = sigmoid(z2)


# backward

z2_d_cost = 2 * (pred-target)
z2_d_pred = sigmoid_p(z2)
z2_cost_pred = z2_d_cost * z2_d_pred

w3 = w3 - z2*lrate*z2_cost_pred
b2 = b2 - lrate*z2_cost_pred

z1_d_pred = sigmoid_p(z1) * z2_cost_pred * w3

w1 = w1 - point[0]*lrate*z1_d_pred
w2 = w2 - point[1]*lrate*z1_d_pred
b1 = b1 - lrate*z1_d_pred
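
For reference, these are the chain-rule quantities the backward pass is approximating (writing σ for the sigmoid and C for the squared-error cost; this is the standard derivation, not taken from the post itself):

```latex
C = (p - t)^2, \qquad p = \sigma(z_2)
\frac{\partial C}{\partial z_2} = 2(p - t)\,\sigma'(z_2)
\frac{\partial C}{\partial w_3} = \frac{\partial C}{\partial z_2}\,\sigma(z_1), \qquad
\frac{\partial C}{\partial b_2} = \frac{\partial C}{\partial z_2}
\frac{\partial C}{\partial z_1} = \frac{\partial C}{\partial z_2}\,w_3\,\sigma'(z_1)
\frac{\partial C}{\partial w_1} = \frac{\partial C}{\partial z_1}\,\mathrm{point}[0], \qquad
\frac{\partial C}{\partial w_2} = \frac{\partial C}{\partial z_1}\,\mathrm{point}[1], \qquad
\frac{\partial C}{\partial b_1} = \frac{\partial C}{\partial z_1}
```

Note that each weight's partial derivative multiplies the upstream gradient by the input feeding that weight, which is why the factor in the w3 update matters.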

Never mind, figured it out. Simple mistake: the w3 update should multiply by the input that feeds w3, which is the first layer's activation sigmoid(z1), not the pre-activation z2. The line should be w3 = w3 - sigmoid(z1)*lrate*z2_cost_pred.
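
A minimal runnable sketch of the full corrected training step, with all gradients computed before any parameter is updated. The toy data point, target, learning rate, and initial weights below are assumptions for illustration, since the original post doesn't show them:

```python
import math

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

def sigmoid_p(x):
    # derivative of the sigmoid: sigma'(x) = sigma(x) * (1 - sigma(x))
    s = sigmoid(x)
    return s * (1.0 - s)

# assumed toy data and initial parameters (not from the original post)
point, target = (0.5, -1.0), 1.0
w1, w2, w3, b1, b2 = 0.1, -0.2, 0.3, 0.0, 0.0
lrate = 0.1

for _ in range(1000):
    # forward
    z1 = point[0]*w1 + point[1]*w2 + b1
    a1 = sigmoid(z1)          # activation feeding w3
    z2 = a1*w3 + b2
    pred = sigmoid(z2)

    # backward: compute every gradient before touching any weight,
    # so z1's gradient uses the pre-update value of w3
    z2_cost_pred = 2*(pred - target) * sigmoid_p(z2)   # dC/dz2
    z1_d_pred = z2_cost_pred * w3 * sigmoid_p(z1)      # dC/dz1

    # w3's gradient multiplies by its input sigmoid(z1), not z2
    w3 -= lrate * a1 * z2_cost_pred
    b2 -= lrate * z2_cost_pred
    w1 -= lrate * point[0] * z1_d_pred
    w2 -= lrate * point[1] * z1_d_pred
    b1 -= lrate * z1_d_pred

loss = (pred - target) ** 2
```

With these assumed values the squared-error loss shrinks steadily over the 1000 steps, which is an easy way to sanity-check the gradient signs.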
