How do I implement a constant neuron in Keras?
I have the following neural network in Python/Keras:
from keras.layers import Input, Dense
from keras.models import Model

input_img = Input(shape=(784,))
encoded = Dense(1000, activation='relu')(input_img) # L1
encoded = Dense(500, activation='relu')(encoded) # L2
encoded = Dense(250, activation='relu')(encoded) # L3
encoded = Dense(2, activation='relu')(encoded) # L4
decoded = Dense(20, activation='relu')(encoded) # L5
decoded = Dense(400, activation='relu')(decoded) # L6
decoded = Dense(100, activation='relu')(decoded) # L7
decoded = Dense(10, activation='softmax')(decoded) # L8
mymodel = Model(input_img, decoded)
What I'd like to do is to have one neuron in each of layers 4 to 7 be a constant 1 (to implement the bias term), i.e. it has no input, has a fixed value of 1, and is fully connected to the next layer. Is there a simple way to do this?
Thanks a lot!
You could create constant input tensors:
import numpy as np
from keras import backend as K
from keras.layers import Input

constant_values = np.ones(shape)
constant = Input(tensor=K.variable(constant_values))
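If the goal is literally a neuron fixed at 1 that is fully connected to the next layer, one way to sketch it (using `tf.keras`; the layer sizes here are made up for illustration, not taken from your model) is to concatenate a column of ones onto a layer's activations, so the following `Dense` layer sees it as an extra input:

```python
import numpy as np
import tensorflow as tf
from tensorflow.keras import layers, Model

# Hypothetical sketch: append a constant 1 to a layer's output so the
# next Dense layer treats it as an extra "bias neuron".
inp = layers.Input(shape=(4,))
h = layers.Dense(3, activation='relu')(inp)
# Lambda layer that concatenates a column of ones onto the activations.
h_plus_one = layers.Lambda(
    lambda x: tf.concat([x, tf.ones_like(x[:, :1])], axis=-1))(h)
out = layers.Dense(2)(h_plus_one)
model = Model(inp, out)

x = np.random.rand(5, 4).astype('float32')
y = model.predict(x, verbose=0)
print(y.shape)  # (5, 2)
```

The constant column contributes one extra weight per unit in the next `Dense` layer, which is exactly what a bias term is.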
With that said, your use case (a bias term) sounds like you should simply use use_bias=True, which is the default, as noted by @gionni.
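To illustrate (a minimal check, with an arbitrary 3-in/5-out layer): a `Dense` layer built with the default `use_bias=True` already carries a trainable bias vector alongside its kernel, which is mathematically the same as a constant-1 neuron feeding the layer.

```python
import tensorflow as tf
from tensorflow.keras import layers

# use_bias=True (the default) gives each Dense layer a trainable bias
# vector -- equivalent to a constant-1 input neuron with its own weights.
layer = layers.Dense(5)              # use_bias defaults to True
layer.build(input_shape=(None, 3))   # 3 input features, 5 units
kernel, bias = layer.get_weights()
print(kernel.shape, bias.shape)      # (3, 5) (5,)
```

So none of layers 4 to 7 need an explicit constant neuron; each already has one, in effect, unless `use_bias=False` is passed.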