
Tensorflow: Transforming manually built layers to tf.contrib.layers

I have these four layers defined:

layer_1 = tf.add(
    tf.matmul(input, tf.Variable(tf.random_normal([n_input, n_hidden_1]))),
    tf.Variable(tf.random_normal([n_hidden_1])))
layer_2 = tf.nn.sigmoid(tf.add(
    tf.matmul(layer_1, tf.Variable(tf.random_normal([n_hidden_1, n_hidden_2]))),
    tf.Variable(tf.random_normal([n_hidden_2]))))
layer_3 = tf.nn.sigmoid(tf.add(
    tf.matmul(layer_2, tf.Variable(tf.random_normal([n_hidden_2, n_hidden_1]))),
    tf.Variable(tf.random_normal([n_hidden_1]))))
layer_4 = tf.add(
    tf.matmul(layer_3, tf.Variable(tf.random_normal([n_hidden_1, n_input]))),
    tf.Variable(tf.random_normal([n_input])))

I would like to transform this code into code based on tf.contrib.layers. So far I got

layer_1 = tf.contrib.layers.fully_connected(
    inputs=input,
    num_outputs=n_hidden_1,
    activation_fn=None)
layer_2 = tf.contrib.layers.fully_connected(
    inputs=layer_1,
    num_outputs=n_hidden_2,
    activation_fn=tf.nn.sigmoid)
layer_3 = tf.contrib.layers.fully_connected(
    inputs=layer_2,
    num_outputs=n_hidden_1,
    activation_fn=tf.nn.sigmoid)
layer_4 = tf.contrib.layers.linear(
    inputs=layer_3,
    num_outputs=n_input)

by reading up on https://www.tensorflow.org/versions/master/tutorials/layers/ and https://www.tensorflow.org/api_docs/python/tf/contrib/layers/fully_connected. I read in https://www.tensorflow.org/api_guides/python/contrib.layers#Higher_level_ops_for_building_neural_network_layers that tf.contrib.layers.linear is an alternative for the linear layer.

But my output is too different from what I got earlier for that to be down to chance. What did I do wrong in the configuration of the layers?

One difference between your code and the tf.contrib.layers version is that the default initializers are different: by default, tf.contrib.layers.fully_connected initializes its weights with tf.contrib.layers.xavier_initializer() and its biases with tf.zeros_initializer(), whereas your original code draws both from tf.random_normal.

These are generally considered to be good defaults for a fully connected layer, but you can override them with a tf.random_normal_initializer as follows:

layer_1 = tf.contrib.layers.fully_connected(
    inputs=input,
    num_outputs=n_hidden_1,
    activation_fn=None,
    weights_initializer=tf.random_normal_initializer(),
    biases_initializer=tf.random_normal_initializer())
# ...
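
Putting the question's layer definitions and this override together, a minimal sketch of the full four-layer stack could look like the following (assuming you want to reproduce the original tf.random_normal initialization for both weights and biases; n_input, n_hidden_1 and n_hidden_2 are the variables from the question):

layer_1 = tf.contrib.layers.fully_connected(
    inputs=input,
    num_outputs=n_hidden_1,
    activation_fn=None,  # linear, like the original layer_1
    weights_initializer=tf.random_normal_initializer(),
    biases_initializer=tf.random_normal_initializer())
layer_2 = tf.contrib.layers.fully_connected(
    inputs=layer_1,
    num_outputs=n_hidden_2,
    activation_fn=tf.nn.sigmoid,
    weights_initializer=tf.random_normal_initializer(),
    biases_initializer=tf.random_normal_initializer())
layer_3 = tf.contrib.layers.fully_connected(
    inputs=layer_2,
    num_outputs=n_hidden_1,
    activation_fn=tf.nn.sigmoid,
    weights_initializer=tf.random_normal_initializer(),
    biases_initializer=tf.random_normal_initializer())
layer_4 = tf.contrib.layers.fully_connected(
    inputs=layer_3,
    num_outputs=n_input,
    activation_fn=None,  # linear output layer
    weights_initializer=tf.random_normal_initializer(),
    biases_initializer=tf.random_normal_initializer())

Note that even with matching initializers the two models will not produce identical numbers from run to run, because the random initial values themselves differ unless you fix the seeds (for example via tf.set_random_seed or the seed argument of the initializer).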
