
ReLU neural network only returns 0

I'm trying to predict the next number after the input using the ReLU activation function. I trained the network several times, but the output is always 0.

Here is the code I'm trying to implement. Can anyone tell me what I'm doing wrong?

import numpy as np, random

class NeuralNetwork():

    def __init__(self):
        random.seed(1)
        self.weights = 0.5

    def relu(self, x):
        # zero out negative entries, leave positive ones unchanged
        for i in range(0, len(x)):
            if x[i] > 0:
                pass
            else:
                x[i] = 0
        return x

    def relu_derv(self, x):
        # derivative of ReLU: 1 for positive entries, 0 otherwise
        for i in range(0, len(x)):
            if x[i] > 0:
                x[i] = 1
            else:
                x[i] = 0
        return x

    def train(self, input, output, iterations):
        for i in xrange(iterations):
            out = self.think(input)
            error = output - out
            adjustments = np.dot(input.T, error * self.relu_derv(out))
            self.weights += adjustments

    def think(self, input):
        return self.relu(np.dot(input, self.weights))


if __name__ == "__main__":
    neural = NeuralNetwork()
    print "before train weights"
    print neural.weights
    input = np.array([1,2,3,4,5,6,7,8,9])
    output = np.array([2,3,4,5,6,7,8,9,10]).T
    print input
    neural.train(input, output, 100000)

    print "after train weights"
    print neural.weights
    print "neural"
    a = [13,15]
    print neural.think(a)

The adjustments value in the code is very large, so adding it to the weight in full makes the update overshoot on every iteration and training diverges; ReLU then clips the output to 0. I scaled the adjustment down (effectively using a small learning rate) and the network started producing the expected output:

 self.weights += adjustments/10000

For inputs 18 and 19, I got outputs of 19 and 20.
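For reference, here is a minimal Python 3 sketch of the same single-weight network with the scaling factor pulled out into an explicit `learning_rate` parameter (that name is mine, not from the original code; the loops are also replaced by vectorized NumPy equivalents):

```python
import numpy as np

class NeuralNetwork:
    def __init__(self, learning_rate=1e-4):
        # learning_rate plays the role of the "/10000" in the fix above
        self.learning_rate = learning_rate
        self.weights = 0.5

    def relu(self, x):
        # vectorized ReLU: max(x, 0) element-wise
        return np.maximum(x, 0)

    def relu_derv(self, x):
        # derivative of ReLU: 1 where x > 0, else 0
        return (x > 0).astype(float)

    def think(self, inputs):
        return self.relu(np.dot(inputs, self.weights))

    def train(self, inputs, outputs, iterations):
        for _ in range(iterations):
            out = self.think(inputs)
            error = outputs - out
            adjustments = np.dot(inputs.T, error * self.relu_derv(out))
            # scale the raw gradient instead of applying it in full
            self.weights += self.learning_rate * adjustments

nn = NeuralNetwork()
inputs = np.array([1, 2, 3, 4, 5, 6, 7, 8, 9], dtype=float)
outputs = np.array([2, 3, 4, 5, 6, 7, 8, 9, 10], dtype=float)
nn.train(inputs, outputs, 10000)
print(nn.weights)                        # learned scalar weight
print(nn.think(np.array([18.0, 19.0])))  # predictions for unseen inputs
```

Because the model is just one scalar weight w fitting out = relu(w * x), gradient descent converges to the least-squares value w = Σxy / Σx² ≈ 1.16 here; the predictions are proportional to the inputs rather than exactly "input + 1", which is as good as this one-parameter model can do.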
