Neural Network with Softmax Output
I am currently learning to code a multilayer perceptron (MLP). For this MLP, I am trying to use a logistic sigmoid for the hidden layer and a softmax for the output layer, assuming two class labels.
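For reference, the update rule the code below appears to implement is the standard gradient pair for a softmax output layer trained with cross-entropy loss (my reading of the code's structure; k is the one-hot target matrix, p the softmax output, z the sigmoid hidden activations):

$$\delta_o = k - p, \qquad \delta_h = \left(\delta_o W_o^{\top}\right) \odot z \odot (1 - z)$$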
import theano
from theano import tensor as T
import numpy as np
import matplotlib.pyplot as plt

alpha = 0.1  # learning rate
no_iters = 1  # trying to get 1 iteration to work first

# Weight matrix to hidden layer (2 inputs into 2 neurons)
w_h = np.array([[1.0, 2.0],
                [-2.0, 0.0]])
# Bias to hidden layer (2 hidden-layer neurons)
b_h = np.array([3.0, -1.0])

# Weight matrix to output layer (2 inputs into 1 neuron)
w_o = np.array([[1.0],
                [1.0]])
# Bias to output layer (only 1 bias for one output neuron)
b_o = np.array([-2.0])

# Input array x (no. of data rows, no. of inputs)
x = np.array([[1.0, 2.0],
              [-2.0, 3.0]])
# Desired outputs (2 data rows = 2 desired outputs)
d = np.array([[0.0],
              [1.0]])
# Assume 2 class labels (one-hot) for the 2 data rows
k = np.array([[1.0, 0.0],
              [0.0, 1.0]])

for iter in range(no_iters):
    # Hidden layer (logistic sigmoid)
    s = np.dot(x, w_h) + b_h
    z = 1.0/(1 + np.exp(-s))

    # Output layer (softmax, stabilised by subtracting the row max)
    u = np.dot(z, w_o) + b_o
    u_max = np.max(u, axis=1, keepdims=True)
    p = np.exp(u - u_max)/np.sum(np.exp(u - u_max), axis=1, keepdims=True)
    y = np.argmax(p, axis=1)

    # Softmax delta for the output layer
    delta_o = k - p

    # Delta for the hidden layer (dz = derivative of the sigmoid)
    dz = z*(1 - z)
    delta_h = np.dot(delta_o, np.transpose(w_o))*dz

    # Update weights and bias of the output layer
    dw = -np.dot(np.transpose(z), delta_o)
    db = -np.sum(delta_o, axis=0)
    w_o = w_o - dw*alpha
    b_o = b_o - db*alpha

    # Update weights and bias of the hidden layer
    w_h = w_h + alpha*np.dot(np.transpose(x), delta_h)
    b_h = b_h + alpha*np.sum(np.transpose(delta_h), axis=1)

print(z)
print(y)
When I run the code, the matrix dot product in the line

delta_h = np.dot(delta_o, np.transpose(w_o))*dz

fails, since delta_o is a 2x2 matrix while np.transpose(w_o) is a 1x2 matrix. Am I using the wrong formula for this problem?
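To make the mismatch concrete, here is a small diagnostic (my addition, using the arrays exactly as defined above, on the first iteration):

print(p.shape)                  # (2, 1): one output neuron
print((k - p).shape)            # (2, 2): k is (2, 2), so k - p broadcasts to (2, 2)
print(np.transpose(w_o).shape)  # (1, 2)
# np.dot(delta_o, np.transpose(w_o)) therefore attempts (2, 2) . (1, 2)
# and raises: ValueError: shapes (2,2) and (1,2) not aligned

Note also that with a single output neuron, p is identically 1.0 (a softmax over one value), which hints that the output layer and the one-hot targets k disagree about the number of classes.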
You cannot take the dot product of two tensors whose sizes do not match. What you can do is take the mean of the error vector and modify the weights element-wise. This will not affect performance and will, I hope, fix the error.
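A minimal sketch of that suggestion, assuming "the mean of the error vector" means averaging delta_o over the class axis so the error again matches the single output neuron (variable names follow the question's code; this is one reading of the answer, not a verified fix):

# inside the training loop, replacing the original delta_o line
delta_o = k - p                                    # (2, 2) after broadcasting
delta_o = np.mean(delta_o, axis=1, keepdims=True)  # (2, 1): mean over the class axis

# every downstream shape now lines up with the single output neuron
delta_h = np.dot(delta_o, np.transpose(w_o))*dz    # (2, 1) . (1, 2) -> (2, 2), matches dz
dw = -np.dot(np.transpose(z), delta_o)             # -> (2, 1), matches w_o
db = -np.sum(delta_o, axis=0)                      # -> (1,), matches b_o

Alternatively, since k one-hot encodes two classes, the shapes also become consistent if the output layer has two neurons, i.e. w_o shaped (2, 2) and b_o shaped (2,); then p, delta_o and k are all (2, 2) and the original update formulas work unchanged.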