
Use keras layer in tensorflow code

Let's say I have a simple neural network with an input layer and a single convolution layer, programmed in TensorFlow:

  # Input Layer
  input_layer = tf.reshape(features["x"], [-1, 28, 28, 1])

  # Convolutional Layer #1
  conv1 = tf.layers.conv2d(
      inputs=input_layer,
      filters=32,
      kernel_size=[5, 5],
      padding="same",
      activation=tf.nn.relu)

I leave out the remaining parts of the network definition and the handling of the features.

If I wanted to add an LSTM layer after this convolution layer, I would have to make the convolution layer TimeDistributed (in Keras terminology) and then feed the output of the TimeDistributed layer into the LSTM.

TensorFlow offers access to the Keras layers in tf.keras.layers. Can I use these Keras layers directly in TensorFlow code? If so, how? Could I also use tf.keras.layers.LSTM to implement the LSTM layer?

So, in general: is a mixture of pure TensorFlow code and Keras code possible, and can I use tf.keras.layers?

Yes, this is possible.

Import both TensorFlow and Keras and link your Keras session to the TensorFlow one:

import tensorflow as tf
import keras
from keras import backend as K

# Create a TF session and register it with Keras, so that both
# libraries operate on the same graph and session (TF 1.x API)
tf_sess = tf.Session()
K.set_session(tf_sess)

Now, in your model definition, you can mix TF and Keras layers like so:

# Input Layer
input_layer = tf.reshape(features["x"], [-1, 28, 28, 1])

# Convolutional Layer #1
conv1 = tf.layers.conv2d(
    inputs=input_layer,
    filters=32,
    kernel_size=[5, 5],
    padding="same",
    activation=tf.nn.relu)

# Flatten conv output
flat = tf.layers.flatten(conv1)

# Fully-connected Keras layer
layer2_dense = keras.layers.Dense(128, activation='relu')(flat)

# Fully-connected TF layer (output)
output_preds = tf.layers.dense(layer2_dense, units=10)
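As for the LSTM part of the question: yes, tf.keras.layers.LSTM can be used the same way, and tf.keras.layers also provides TimeDistributed for applying the convolution to every timestep. A minimal sketch using the tf.keras functional API (the sequence length of 10 frames, the 64 LSTM units, and the other shapes are arbitrary assumptions, not taken from the question):

```python
import tensorflow as tf

# Hypothetical input: batches of 10-step sequences of 28x28 grayscale frames
seq_input = tf.keras.Input(shape=(10, 28, 28, 1))

# TimeDistributed applies the same Conv2D to each of the 10 timesteps
conv = tf.keras.layers.Conv2D(32, (5, 5), padding="same", activation="relu")
conv_seq = tf.keras.layers.TimeDistributed(conv)(seq_input)

# Flatten each timestep's feature map to get a (batch, time, features) tensor
flat_seq = tf.keras.layers.TimeDistributed(tf.keras.layers.Flatten())(conv_seq)

# The LSTM consumes the per-timestep feature vectors directly
lstm_out = tf.keras.layers.LSTM(64)(flat_seq)

model = tf.keras.Model(inputs=seq_input, outputs=lstm_out)
```

Since the LSTM returns only its final state here, the model's output shape is (batch, 64); pass return_sequences=True to the LSTM if a later layer needs the full sequence.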

This answer is adapted from a Keras blog post by Francois Chollet.
