How to implement a neural network with a not-fully-connected layer as the final layer?

I would like to implement a neural network with an input layer, two dense hidden layers, and a non-dense output layer. A toy example is shown in the figure below. The first hidden layer has three neurons, the second has two, and the final layer has four, but between the second and final layers there are only four connections.

[Figure: network architecture]

I would like to use the Keras functional API. How can I implement it? Should I manually set the missing weights to 0? I would start as follows:

input = keras.layers.Input(...)
hidden1 = keras.layers.Dense(3, activation="..")(input)
hidden2 = keras.layers.Dense(2, activation="..")(hidden1)  # the second hidden layer has two neurons

but then I do not know how to proceed.

The final layer is actually two separate Dense layers, each with 2 neurons and each connected to a different neuron of the previous layer. Therefore, you can simply split the neurons of the second-to-last layer and pass them to two different layers:

import keras  # standalone Keras; with tf.keras, use: from tensorflow import keras

input = keras.layers.Input(shape=(3,))
hidden1 = keras.layers.Dense(3)(input)
hidden2 = keras.layers.Dense(2)(hidden1)
hidden2_n1 = keras.layers.Lambda(lambda x: x[:,0:1])(hidden2)  # take the first neuron
hidden2_n2 = keras.layers.Lambda(lambda x: x[:,1:])(hidden2)   # take the second neuron
output1 = keras.layers.Dense(2)(hidden2_n1)
output2 = keras.layers.Dense(2)(hidden2_n2)
output = keras.layers.concatenate([output1, output2])  # optional: concatenate the layers to have a single output layer

model = keras.models.Model(input, output)
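
As a quick sanity check (a sketch, assuming the model above; NumPy is used only to build dummy input): each output head is a Dense(2) fed by a single neuron, so it should report 1×2 weights plus 2 biases, i.e. exactly four connections into the final four neurons:

import numpy as np

model.summary()  # each output head: 1*2 weights + 2 biases = 4 parameters

# The full model maps (batch, 3) -> (batch, 4) after concatenation.
x = np.random.rand(8, 3).astype("float32")
print(model.predict(x).shape)  # (8, 4)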

In tf.keras, or in newer versions of keras, instead of using Lambda layers you could simply slice the tensor directly:

output1 = keras.layers.Dense(2)(hidden2[:,0:1])
output2 = keras.layers.Dense(2)(hidden2[:,1:])
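
As for the question of manually setting the missing weights to 0: that also works if you want to keep a single output layer. A minimal sketch, assuming tf.keras; MaskedDense and the mask layout below are illustrative, not a library API. The kernel is multiplied element-wise by a fixed 0/1 mask, so the masked connections stay zero (and receive zero gradient) during training:

import tensorflow as tf
from tensorflow import keras

class MaskedDense(keras.layers.Layer):
    """Dense-like layer whose kernel is multiplied by a fixed binary mask."""
    def __init__(self, units, mask, **kwargs):
        super().__init__(**kwargs)
        self.units = units
        self.mask = tf.constant(mask, dtype="float32")  # shape: (input_dim, units)

    def build(self, input_shape):
        self.kernel = self.add_weight(
            name="kernel", shape=(input_shape[-1], self.units),
            initializer="glorot_uniform", trainable=True)
        self.bias = self.add_weight(
            name="bias", shape=(self.units,), initializer="zeros", trainable=True)

    def call(self, inputs):
        # Zero out the masked connections before the matrix product.
        return tf.matmul(inputs, self.kernel * self.mask) + self.bias

# Connectivity from the toy figure: neuron 0 of hidden2 feeds outputs 0-1,
# neuron 1 feeds outputs 2-3.
mask = [[1.0, 1.0, 0.0, 0.0],
        [0.0, 0.0, 1.0, 1.0]]
output = MaskedDense(4, mask)(hidden2)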
