
Concatenate layers to a fully connected layer in Tensorflow

I am trying to implement the siamese network from Sergey Zagoruyko's paper using Tensorflow:

http://www.cv-foundation.org/openaccess/content_cvpr_2015/papers/Zagoruyko_Learning_to_Compare_2015_CVPR_paper.pdf

I don't know how to concatenate the two input branches into a top network (fully connected layer + ReLU + fully connected layer).

This may not be what you are looking for, but I recommend trying Keras. It is a flexible, high-level framework built on TensorFlow that makes it extremely easy to accomplish what you are attempting. This is how you could do it in Keras (with 32 inputs and 32 neurons in your FC layers):

from keras.models import Sequential
from keras.layers import Dense, Activation

model = Sequential()
model.add(Dense(32, input_shape=(32,)))  # first FC layer; input_shape declares the 32-dim input
model.add(Activation("relu"))
model.add(Dense(32))                     # second FC layer
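
Since the core of your question is joining the two branches of the siamese network, here is a minimal sketch of how that merge could look with the Keras functional API. The 128-dimensional branch features, the layer sizes, and the single-output score are assumptions for illustration, not values taken from the paper.

from keras.models import Model
from keras.layers import Input, Dense, concatenate

# Hypothetical feature vectors produced by the two branches of the siamese network
feat_a = Input(shape=(128,))
feat_b = Input(shape=(128,))

merged = concatenate([feat_a, feat_b])      # join the two branch outputs
top = Dense(32, activation="relu")(merged)  # fully connected layer + ReLU
top = Dense(1)(top)                         # final fully connected layer (e.g. similarity score)

model = Model(inputs=[feat_a, feat_b], outputs=top)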

Alternatively, using just TensorFlow, you could take this approach:

import tensorflow as tf

x = tf.placeholder(tf.float32, shape=[None, 32])  # 32-dimensional input
y = tf.layers.dense(x, 32)                        # fully connected layer
y = tf.nn.relu(y)                                 # ReLU activation
y = tf.layers.dense(y, 32)                        # second fully connected layer
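
And if you stay with plain TensorFlow, the two branch outputs could be joined with tf.concat before the top network. Again, this is only a sketch under the same assumed feature and layer sizes:

import tensorflow as tf

# Hypothetical placeholders for the feature vectors coming out of the two branches
feat_a = tf.placeholder(tf.float32, shape=[None, 128])
feat_b = tf.placeholder(tf.float32, shape=[None, 128])

merged = tf.concat([feat_a, feat_b], axis=1)  # concatenate along the feature axis
top = tf.layers.dense(merged, 32)             # fully connected layer
top = tf.nn.relu(top)                         # ReLU
output = tf.layers.dense(top, 1)              # final fully connected layer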

But I personally think Keras is more elegant, and it adds a lot of useful features, such as model.output, model.input, and much more. In fact, Keras has recently been built into TensorFlow's contrib module as tf.contrib.keras. Hope that helps!
