
How can I create a locally-connected layer with both locally-connected and dense parent layers?

I currently have a network (image of toy example) with two input layers. in1 is just a short, flat vector of values, but in2 is a 27-channel image. I want my network to be built on locally-connected layers, but I don't know a good way to mix in1's data in with in2's. Currently I flatten in2's branch after a few layers, merge it with in1, and continue with dense layers from there.

How can I densely introduce in1's data while maintaining the locally-connected architecture? The image linked above shows this goal with a red arrow.

One possible solution I came up with is to copy in1's vector into in2 as extra channels, so that in2's dimensions would become width * height * (num_original_channels + len(in1)). This seems inelegant because it would copy in1 many times. There must be a better way.
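For what it's worth, a minimal sketch of that channel-copying idea might look like the following (the sizes are made up, and UpSampling2D is just one way to repeat the vector at every spatial position; none of this comes from my actual model):

from tensorflow.keras.layers import Input, Reshape, UpSampling2D, Concatenate

in1 = Input( shape=(200,) )         # small flat vector
in2 = Input( shape=(50, 50, 10) )   # multi-channel image

# Reshape the vector into a 1x1x200 "image", then repeat it at every pixel
v = Reshape( (1, 1, 200) )( in1 )
v = UpSampling2D( size=(50, 50) )( v )          # 50x50x200, same vector at every position
merged = Concatenate( axis=-1 )([ in2, v ])     # 50x50x(10 + 200)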

I'm new to Keras, so please pardon my shaky vocabulary. Also, this is a toy example just to illustrate my idea, so there may be some other, unrelated architectural criticisms.

Thanks in advance for any advice!

FWIW, here is the code I am using:

input1 = Input( ... ) #small flat vec
input2 = Input( ... ) #deep picture

pre = Reshape( ... )( input2 )
l1 = LocallyConnected2D( ... )( pre )
l2 = LocallyConnected2D( ... )( l1 )
l3 = LocallyConnected2D( ... )( l2 )
flat = Flatten( ... )( l3 )
merge = tensorflow.keras.layers.concatenate( [flat, input1], ... )
l4 = Dense( ... )( merge )
l5 = Dense( ... )( l4 )
output = Dense( ... )( l5 )

Answering my own question here. It seems like the best solution is to have input1 and input2 each produce a tensor of the same shape with no activation function, sum the two tensors together, and then apply the activation.

Using my example from before, it would look something like this:

(I'm adding example dimensions to hopefully clarify what I mean. They're made up out of thin air.)

input1 = Input( ... ) #small flat vec, 1x200
input2 = Input( ... ) #deep picture,   50x50x10

l1 = LocallyConnected2D( activation=None, ... )( input2 ) # 40x40x5

num_elements = 40 * 40 * 5
d1 = Dense( units=num_elements, activation=None, ... )( input1 ) # 1x8000
d1_3D = Reshape( target_shape=(40, 40, 5,) )( d1 ) #40x40x5

merge = Add()([ l1, d1_3D ]) #40x40x5
l2 = LeakyReLU( ... )( merge ) #Or whatever activation function you want, 40x40x5

# ...
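And for completeness, here is a self-contained version of the same idea that should run end to end, assuming a TF/Keras version that still provides LocallyConnected2D. The kernel size and filter count are assumptions chosen only so the shapes come out to 40x40x5 as above, and the final Dense head is just a placeholder so the model builds:

from tensorflow.keras.layers import ( Input, Dense, Reshape, Add, LeakyReLU,
                                      LocallyConnected2D, Flatten )
from tensorflow.keras.models import Model

input1 = Input( shape=(200,) )        # small flat vec, 1x200
input2 = Input( shape=(50, 50, 10) )  # deep picture,   50x50x10

# Locally-connected branch, no activation yet: 50x50x10 -> 40x40x5
l1 = LocallyConnected2D( filters=5, kernel_size=(11, 11), activation=None )( input2 )

# Dense branch producing one value per element of l1's output, reshaped to match
d1 = Dense( units=40 * 40 * 5, activation=None )( input1 )   # 1x8000
d1_3D = Reshape( target_shape=(40, 40, 5) )( d1 )            # 40x40x5

# Sum the two pre-activation tensors, then apply the nonlinearity once
merge = Add()([ l1, d1_3D ])   # 40x40x5
l2 = LeakyReLU()( merge )      # 40x40x5

# Placeholder head just so the model compiles end to end
output = Dense( units=1 )( Flatten()( l2 ) )
model = Model( inputs=[input1, input2], outputs=output )
model.summary()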
