ReLU neural network only returns 0
I'm trying to print the number that follows the input, using the ReLU activation function. I trained the network several times, but the output is always 0.

Here is the code I'm trying to implement. Can anyone tell me what I'm doing wrong?
import numpy as np, random

class NeuralNetwork():
    def __init__(self):
        random.seed(1)
        self.weights = 0.5

    def relu(self, x):
        for i in range(0, len(x)):
            if x[i] > 0:
                pass
            else:
                x[i] = 0
        return x

    def relu_derv(self, x):
        for i in range(0, len(x)):
            if x[i] > 0:
                x[i] = 1
            else:
                x[i] = 0
        return x

    def train(self, input, output, iterations):
        for i in xrange(iterations):
            out = self.think(input)
            error = output - out
            adjustments = np.dot(input.T, error * self.relu_derv(out))
            self.weights += adjustments

    def think(self, input):
        return self.relu(np.dot(input, self.weights))

if __name__ == "__main__":
    neural = NeuralNetwork()
    print "before train weights"
    print neural.weights
    input = np.array([1, 2, 3, 4, 5, 6, 7, 8, 9])
    output = np.array([2, 3, 4, 5, 6, 7, 8, 9, 10]).T
    print input
    neural.train(input, output, 100000)
    print "after train weights"
    print neural.weights
    print "neural"
    a = [13, 15]
    print neural.think(a)
The `adjustments` variable in the code takes on a large value, so when the weight is incremented by it the update overshoots and the output collapses to 0. I incremented the weight by a scaled-down adjustment instead, and then I got the expected output:
self.weights += adjustments/10000
For inputs 18 and 19, I got outputs of 19 and 20.
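Dividing the adjustment by 10000 is effectively a learning rate. A minimal Python 3 sketch of the same idea on the question's toy data (the vectorized `relu`/`relu_deriv` helpers here are my own stand-ins for the question's in-place loops, and `lr` is an assumed learning-rate value):

```python
import numpy as np

def relu(x):
    # Element-wise ReLU: max(0, x)
    return np.maximum(x, 0.0)

def relu_deriv(x):
    # Subgradient of ReLU: 1 where x > 0, else 0
    return (x > 0).astype(float)

# Same toy data as the question: inputs 1..9, targets 2..10
X = np.arange(1, 10, dtype=float)
y = X + 1.0

w = 0.5       # single scalar weight, as in the question
lr = 1e-4     # small learning rate so the raw update doesn't blow up

for _ in range(100000):
    out = relu(X * w)
    error = y - out
    # Same raw adjustment as the question's train() ...
    adjustment = np.dot(X, error * relu_deriv(out))
    # ... but scaled down before applying it
    w += lr * adjustment

print(w)  # converges near the least-squares slope, ~1.158
print(relu(np.array([13.0, 15.0]) * w))
```

Without the `lr` factor, the very first update adds a value on the order of hundreds to `w`, the error flips sign and grows each step, and `w` is eventually driven so far negative that every pre-activation is below zero and ReLU outputs 0 everywhere, which matches the symptom in the question.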