
Neural Network sigmoid function

I'm trying to make a neural network and I have a couple of questions.

My sigmoid function is something like this:

s = 1/(1+(2.7183**(-self.values)))
if s > self.weight:
    self.value = 1
else:
    self.value = 0

self.values is the sum over the connected nodes. For instance, the HNs (hidden nodes) in HL (hidden layer) 1 are connected to all input nodes, so their self.values is sum(inputnodes.values).

The HNs in HL2 are connected to all HNs in HL1, and their self.values is sum(HL.values).

The problem is that every node ends up with a value of 1, no matter what its weight is (unless the weight is very high, around 0.90~0.99).
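A quick numeric check of the logistic function shows why: sums of several active inputs land on the flat upper tail of the curve, so almost any threshold below ~0.95 is exceeded (a standalone sketch, not part of the original code):

```python
import math

def sigmoid(x):
    """Logistic function: squashes any real input into (0, 1)."""
    return 1.0 / (1.0 + math.exp(-x))

# The sums seen in the log below saturate the curve:
assert round(sigmoid(4), 2) == 0.98  # sum of 4 active inputs
assert round(sigmoid(3), 2) == 0.95  # sum of 3 active inputs
```

Any weight threshold below these saturated outputs fires the node, which matches the log output.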

My neural network is set up like so:

(inputs, num_hidden_layers, num_hidden_nodes_per_layer, num_output_nodes), where inputs is a list of binary values:

Here's a log that shows this behavior.

>>NeuralNetwork([1,0,1,1,1,0,0],3,3,1)# 3 layers, 3 nodes each, 1 output
Layer1
Node: y1 Sum: 4, Sigmoid: 0.98, Weight: 0.10, self.value: 1
Node: y2 Sum: 4, Sigmoid: 0.98, Weight: 0.59, self.value: 1
Node: y3 Sum: 4, Sigmoid: 0.98, Weight: 0.74, self.value: 1
Layer2
Node: y1 Sum: 3, Sigmoid: 0.95, Weight: 0.30, self.value: 1
Node: y2 Sum: 3, Sigmoid: 0.95, Weight: 0.37, self.value: 1
Node: y3 Sum: 3, Sigmoid: 0.95, Weight: 0.80, self.value: 1
Layer3
Node: y1 Sum: 3, Sigmoid: 0.95, Weight: 0.70, self.value: 1
Node: y2 Sum: 3, Sigmoid: 0.95, Weight: 0.56, self.value: 1
Node: y3 Sum: 3, Sigmoid: 0.95, Weight: 0.28, self.value: 1

Even if I try using floating-point values in the input, the result is the same:

>>NeuralNetwork([0.64, 0.57, 0.59, 0.87, 0.56],3,3,1)
Layer1
Node: y1 Sum: 3.23, Sigmoid: 0.96, Weight: 0.77, self.value: 1
Node: y2 Sum: 3.23, Sigmoid: 0.96, Weight: 0.45, self.value: 1
Node: y3 Sum: 3.23, Sigmoid: 0.96, Weight: 0.83, self.value: 1
Layer2
Node: y1 Sum: 3, Sigmoid: 0.95, Weight: 0.26, self.value: 1
Node: y2 Sum: 3, Sigmoid: 0.95, Weight: 0.39, self.value: 1
Node: y3 Sum: 3, Sigmoid: 0.95, Weight: 0.53, self.value: 1
Layer3
Node: y1 Sum: 3, Sigmoid: 0.95, Weight: 0.43, self.value: 1
Node: y2 Sum: 3, Sigmoid: 0.95, Weight: 0.52, self.value: 1
Node: y3 Sum: 3, Sigmoid: 0.95, Weight: 0.96, self.value: 0

Note Node y3 in Layer3, the only one that returned a 0 after the sigmoid.

What am I doing wrong?

Also, is it really necessary to connect every node to every node in the previous layer? Wouldn't it be better to make the connections random?

EDIT: Forgot to mention, this is an in-development NN; I'll be using a genetic algorithm to train the network.

EDIT2:

class NeuralNetwork:
    def __init__(self, inputs, num_hidden_layers, num_hidden_nodes_per_layer, num_output):
        self.input_nodes = inputs
        self.num_inputs = len(inputs)
        self.num_hidden_layers = num_hidden_layers
        self.num_hidden_nodes_per_layer = num_hidden_nodes_per_layer
        self.num_output = num_output

        self.createNodes()
        self.weights = self.getWeights()
        self.connectNodes()
        self.updateNodes()

    def createNodes(self):
        self._input_nodes = []
        for i, v in enumerate(self.input_nodes):
            node = InputNode("x"+str(i+1),v)
            self._input_nodes.append(node)

        self._hidden_layers = []
        for n in xrange(self.num_hidden_layers):
            layer = HiddenLayer("Layer"+str(n+1),self.num_hidden_nodes_per_layer)
            self._hidden_layers.append(layer)

    def getWeights(self):
        weights = []
        for node in self._input_nodes:
            weights.append(node.weight)

        for layer in self._hidden_layers:
            for node in layer.hidden_nodes:
                weights.append(node.weight)
        return weights

    def connectNodes(self):
        for i,layer in enumerate(self._hidden_layers):
            for hidden_node in layer.hidden_nodes:
                if i == 0:
                    for input_node in self._input_nodes:
                        hidden_node.connections.append(input_node)
                else:
                    for previous_node in self._hidden_layers[i-1].hidden_nodes:
                        hidden_node.connections.append(previous_node)

    def updateNodes(self):
        for layer in self._hidden_layers:
            for node in layer.hidden_nodes:
                node.updateValue()

And here's the updateValue() method of the nodes:

def updateValue(self):
    value = 0
    for node in self.connections:
        value += node.value
    self.sigmoid(value) # the function at the beginning of the question.

The nodes created just have a value, a name, and a weight (random at start).

You are mashing together several different NN concepts.

The logistic function (which is the generalized form of the sigmoid) already serves as a threshold. Specifically, it is a differentiable threshold, which is essential for the backpropagation learning algorithm. So you don't need the piecewise threshold function (the if statement).
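A minimal sketch of dropping the if statement and keeping the raw logistic output. The derivative shown is what backpropagation would use; the genetic-algorithm approach in the question does not need it, but it illustrates why differentiability matters:

```python
import math

def sigmoid(x):
    # Smooth threshold: maps (-inf, inf) into (0, 1).
    return 1.0 / (1.0 + math.exp(-x))

def sigmoid_derivative(x):
    # d/dx sigmoid(x) = sigmoid(x) * (1 - sigmoid(x)),
    # the closed form backpropagation relies on.
    s = sigmoid(x)
    return s * (1.0 - s)

# The node keeps the continuous activation instead of snapping to 0/1:
activation = sigmoid(0.5)  # roughly 0.62, passed on to the next layer as-is
```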

The weights are analogues of synaptic strength and are applied during summation (i.e., feedforward propagation). So each connection between a pair of nodes has a weight that is multiplied by the sending node's activation level (the output of the threshold function).
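A hedged sketch of what the summation could look like with per-connection weights. The `connections` and `value` attributes follow the question's code; the `weights` list kept parallel to `connections` is an assumed layout, not the question's (which stores one weight per node):

```python
import math

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

class HiddenNode:
    def __init__(self, name):
        self.name = name
        self.value = 0.0
        self.connections = []  # sending nodes, as in the question's code
        self.weights = []      # one weight per connection (assumed layout)

    def updateValue(self):
        # Weighted sum: each incoming activation is scaled by its
        # connection's weight before the threshold is applied.
        total = sum(w * node.value
                    for w, node in zip(self.weights, self.connections))
        self.value = sigmoid(total)
```

With this layout, a node's output varies with how its inputs are weighted rather than collapsing to the same saturated sum for every node in a layer.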

Finally, even with these changes, a fully-connected neural network with all positive weights will probably still produce all 1's at the output. You can either include negative weights corresponding to inhibitory nodes, or reduce connectivity significantly (e.g. with a 0.1 probability that a node in layer n connects to a node in layer n+1).
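Two illustrative ways to implement those fixes, assuming weights and connections are drawn at construction time (the helper names are hypothetical, not from the question's code):

```python
import random

def random_weight():
    # Symmetric around zero, so roughly half the connections are inhibitory.
    return random.uniform(-1.0, 1.0)

def should_connect(p=0.1):
    # Sparse wiring: connect a node pair with probability p instead of always.
    return random.random() < p

random.seed(0)  # fixed seed so the sketch is reproducible
weights = [random_weight() for _ in range(1000)]
negative = sum(1 for w in weights if w < 0)
assert 400 < negative < 600  # roughly half the sampled weights are inhibitory
```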
