
How to drop out an entire hidden layer in a neural network?

I am trying to build a neural network in TensorFlow 2.0. I want to drop out a whole hidden layer with a given probability, not individual nodes with a certain probability. Can anyone tell me how to drop an entire layer in TensorFlow 2.0?

Set the noise_shape argument of the Dropout layer to [1] * n_dims of the input. Let's say the input tensor is 2-D:

import tensorflow as tf

x = tf.ones([3,5])
<tf.Tensor: shape=(3, 5), dtype=float32, numpy=
array([[1., 1., 1., 1., 1.],
       [1., 1., 1., 1., 1.],
       [1., 1., 1., 1., 1.]], dtype=float32)>

noise_shape should be [1, 1].

tf.nn.dropout(x, rate=.5, noise_shape=[1, 1])

It will then randomly return either of these as the output (kept values are scaled by 1 / (1 - rate), which is why the ones become 2.):

<tf.Tensor: shape=(3, 5), dtype=float32, numpy=
array([[2., 2., 2., 2., 2.],
       [2., 2., 2., 2., 2.],
       [2., 2., 2., 2., 2.]], dtype=float32)>
<tf.Tensor: shape=(3, 5), dtype=float32, numpy=
array([[0., 0., 0., 0., 0.],
       [0., 0., 0., 0., 0.],
       [0., 0., 0., 0., 0.]], dtype=float32)>
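For comparison, here is a short sketch of how this differs from the default behaviour (the zero pattern is random, so your output will differ):

import tensorflow as tf

x = tf.ones([3, 5])

# Default noise_shape: each of the 15 entries is kept (and scaled to 2.)
# or zeroed independently.
print(tf.nn.dropout(x, rate=.5))

# noise_shape=[1, 1]: a single Bernoulli draw is broadcast over the whole
# tensor, so either every entry is 2. or every entry is 0.
print(tf.nn.dropout(x, rate=.5, noise_shape=[1, 1]))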

You can test it like this with a Keras layer:

tf.keras.layers.Dropout(rate=.5, noise_shape=[1, 1])(x, training=True)

If you use it in a model, just remove the explicit training argument (Keras sets it automatically during training and inference), and make sure you specify the noise_shape manually.
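For example, here is a minimal Functional-API sketch (the layer sizes are just for illustration) where one keep/drop decision is broadcast over the whole hidden activation:

import tensorflow as tf

inputs = tf.keras.Input(shape=(5,))   # batch dimension is implicit
h = tf.keras.layers.Dense(4)(inputs)
# noise_shape=[1, 1] drops (or keeps) the entire (batch, features)
# activation at once instead of individual units.
h = tf.keras.layers.Dropout(rate=.5, noise_shape=[1, 1])(h)
outputs = tf.keras.layers.Dense(1)(h)
model = tf.keras.Model(inputs, outputs)

Keras turns the dropout on during fit and off during inference on its own, so no explicit training argument is needed here.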

In a subclassed model, something like this should work, although I haven't tested it:

class SubclassedModel(tf.keras.Model):
    def __init__(self):
        super(SubclassedModel, self).__init__()
        self.dense = tf.keras.layers.Dense(4)

    def call(self, inputs, training=None, mask=None):
        # One 1 per input dimension (e.g. [1, 1] for a 2-D input), so a
        # single keep/drop decision is broadcast over the whole layer.
        noise_shape = [1] * len(inputs.shape)
        x = tf.keras.layers.Dropout(rate=.5,
                                    noise_shape=noise_shape)(inputs, training=training)
        x = self.dense(x)
        return x
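A quick way to exercise the sketch above (the input shape is arbitrary; dropout is only active when training=True):

model = SubclassedModel()
x = tf.ones([3, 5])

# With rate=.5 and a noise_shape of all ones, the whole input to the dense
# layer is either scaled by 2 or zeroed on each call, so repeated calls
# return one of two possible outputs.
print(model(x, training=True))
print(model(x, training=True))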
