
Concatenate layers to a fully connected layer in TensorFlow

I am trying to implement the siamese network from Sergey Zagoruyko's paper in TensorFlow:

http://www.cv-foundation.org/openaccess/content_cvpr_2015/papers/Zagoruyko_Learning_to_Compare_2015_CVPR_paper.pdf

I don't know how to concatenate the two input branches into a top network (fully connected layer + ReLU + fully connected layer).

This may not be what you are looking for, but I recommend trying Keras. It is a flexible, high-level framework built on TensorFlow that makes it extremely easy to accomplish what you are attempting. This is how you could do it in Keras (with 32 inputs and 32 neurons in your FC layers):

from keras.models import Sequential
from keras.layers import Dense, Activation

model = Sequential()
model.add(Dense(32, input_shape=(32,)))  # first FC layer; input_shape declares the 32-dim input
model.add(Activation("relu"))
model.add(Dense(32))                     # second FC layer
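The Sequential model above is a single branch; to actually join the two siamese branches into one top network, the usual route is the Keras functional API with a `concatenate` layer. A minimal sketch (the 64-dimensional branch features are an assumption for illustration; with recent TensorFlow the same API is available under `tf.keras`):

```python
from keras.models import Model
from keras.layers import Input, Dense, concatenate

# one input per siamese branch (64-dim feature vectors are an assumption)
left = Input(shape=(64,))
right = Input(shape=(64,))

# join the two branches along the feature axis, then the top network:
# fully connected -> relu -> fully connected
merged = concatenate([left, right])            # shape (batch, 128)
hidden = Dense(32, activation="relu")(merged)
output = Dense(32)(hidden)

model = Model(inputs=[left, right], outputs=output)
```

Calling `model([left_batch, right_batch])` then runs both branches' features through the shared top network.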

Alternatively, using just TensorFlow, you could use this strategy:

import tensorflow as tf

# placeholder for the input; here a batch of 32-dimensional vectors
x = tf.placeholder(tf.float32, shape=[None, 32])
y = tf.layers.dense(x, 32)   # first fully connected layer
y = tf.nn.relu(y)
y = tf.layers.dense(y, 32)   # second fully connected layer
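In plain TensorFlow, the two branch outputs would be joined with `tf.concat(..., axis=1)` before feeding the dense layers. The shape arithmetic is the same as NumPy concatenation along the feature axis, sketched here with NumPy stand-ins (the batch size of 4 and feature size of 64 are illustrative assumptions):

```python
import numpy as np

# stand-ins for the two branch outputs: (batch, features) arrays
left = np.zeros((4, 64))
right = np.ones((4, 64))

# tf.concat([left, right], axis=1) joins along the feature axis the same way
merged = np.concatenate([left, right], axis=1)
print(merged.shape)  # (4, 128)
```

The merged `(batch, 128)` tensor is what you would pass into the first `tf.layers.dense` of the top network.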

But I personally think Keras is more elegant, plus it adds a whole lot more useful features, such as model.output, model.input, and much more. In fact, Keras has recently been built into TensorFlow's contrib module as tf.contrib.keras. Hope that helps!
