

Zeros as Neural Network Input

Currently I'm attempting to create a three-layer neural network. When I began attempting to train it on XOR, this thought crossed my mind:

double NewWeight(double oldWeight){
    // New weight = old weight + (error * input * learning rate)
    return oldWeight + (MeanSquaredError * input * learningRate);
}

This is the formula for a new weight according to http://natureofcode.com/book/chapter-10-neural-networks/

First, if I have an input of zero, the weight will remain the same regardless of the error. Is this solved by using a bias?
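To make that concrete, here is a small worked example with made-up numbers (the values are purely illustrative):

double oldWeight = 0.5, MeanSquaredError = 0.8, learningRate = 0.1;
double input = 0.0;
// Update term = 0.8 * 0.0 * 0.1 = 0.0, so the weight stays at 0.5 no matter how large the error is.
double newWeight = oldWeight + (MeanSquaredError * input * learningRate);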

Secondly, neural networks often have more than one input (such as in XOR). In that case, would you need to add the two inputs together? Or perhaps take the mean of the weight updates from the separate inputs?

If you suggest I use a different weight-update function, please don't post an equation without explaining the symbols in it. Thanks!

First, the bias does not change anything here. Usually, the bias is realised as an additional input with a constant value of 1, whose weight acts as the bias. See https://en.wikipedia.org/wiki/Perceptron#Definitions
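As a sketch of that idea (the variable names and numbers below are illustrative, not taken from the linked article), the bias can be appended to the input vector as a constant 1 with its own weight:

// Two real inputs plus a constant bias input of 1; the last weight acts as the bias.
double[] inputs  = {0.0, 0.0, 1.0};
double[] weights = {0.4, -0.2, 0.1};
double error = 0.5, learningRate = 0.1;
for (int i = 0; i < weights.length; i++) {
    // Even when both real inputs are 0, the bias weight still receives an update,
    // because its input is the constant 1.
    weights[i] += error * inputs[i] * learningRate;
}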

Second, you calculate the weights for each edge in your network. So, if you have two inputs, you calculate a separate weight update for each of them.
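In other words, every input edge keeps its own weight, and the neuron sees the weighted sum over all inputs; nothing is added together or averaged before updating. A sketch with illustrative values:

double[] inputs  = {1.0, 0.0};   // e.g. one XOR input pattern
double[] weights = {0.3, 0.7};   // one weight per input edge
double sum = 0.0;
for (int i = 0; i < inputs.length; i++) {
    sum += inputs[i] * weights[i];                   // weighted sum fed into the neuron
}
double error = 0.25, learningRate = 0.1;
for (int i = 0; i < weights.length; i++) {
    weights[i] += error * inputs[i] * learningRate;  // each weight updated with its own input
}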

I would say that if you have 0 as an input, you have no information. With no information, you cannot tell how to change the weight. Your function is absolutely correct for back-propagation.
