
Tensorflow Dropout: What happens if I apply two dropout layers?

Let's say I am constructing a neural network like so:

x = tf.nn.conv2d(input, ...)
x = tf.nn.max_pool(x, ...)
x = tf.nn.dropout(x, keep_prob=1.)           # keep everything: intended as a no-op
x = tf.nn.thislayershallhavedropout(x, ...)  # placeholder for the layer that should get dropout
x = tf.nn.dropout(x, keep_prob=.5)           # drop 50% of this layer's activations

Would this be an effective way to tell TensorFlow to apply dropout only to the layer thislayershallhavedropout?

Basically, what I am trying to do is to tell TensorFlow to use dropout on a single layer only, without it cascading back into earlier layers.

Dropout sets the activations that pass through it to 0 with a given probability (1 - keep_prob). There is no real notion of giving a whole 'layer' dropout: you are only zeroing out individual activations with a given probability.
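
To make the element-wise behaviour concrete, here is a minimal sketch. It assumes TensorFlow 2.x eager mode, where tf.nn.dropout takes rate = 1 - keep_prob instead of keep_prob:

import tensorflow as tf

tf.random.set_seed(0)
x = tf.ones([1, 8])

# rate=0.5 (keep_prob=0.5): each activation is zeroed independently with
# probability 0.5, and survivors are scaled by 1/keep_prob = 2
print(tf.nn.dropout(x, rate=0.5).numpy())  # a mix of 0.0 and 2.0 entries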

If you want to apply dropout to the outgoing activations of a particular layer, you should do:

x = tf.nn.thislayershallhavedropout(x,...)
x = tf.nn.dropout(x, keep_prob=.5)

Which is basically what you have done: 50% of the activations coming from thislayershallhavedropout will be deactivated (set to zero). Note that tf.nn.dropout also scales the surviving activations by 1/keep_prob, so the expected sum of the activations stays the same.
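
As a minimal sketch of the whole pattern (assuming TensorFlow 2.x ops; the input shape, filter shape, and the relu standing in for thislayershallhavedropout are made up for illustration):

import tensorflow as tf

images = tf.random.normal([4, 32, 32, 3])   # hypothetical input batch
filters = tf.random.normal([3, 3, 3, 16])   # hypothetical conv filters

x = tf.nn.conv2d(images, filters, strides=1, padding="SAME")
x = tf.nn.max_pool2d(x, ksize=2, strides=2, padding="SAME")
x = tf.nn.relu(x)               # stand-in for thislayershallhavedropout
x = tf.nn.dropout(x, rate=0.5)  # only this layer's outputs get dropout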

By the way, as was pointed out in the comments, setting keep_prob to 1 has no effect at all: it lets all activations pass through unchanged.

x = tf.nn.dropout(x, keep_prob=1.)
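
You can check the no-op behaviour directly (again a sketch assuming TensorFlow 2.x, where keep_prob=1. corresponds to rate=0.):

import tensorflow as tf

x = tf.ones([1, 4])
# rate=0. drops nothing and applies no scaling, so the input passes through
print(tf.nn.dropout(x, rate=0.).numpy())  # [[1. 1. 1. 1.]]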

Keep in mind: dropout does not directly interfere with previous layers; however, during backpropagation, the weights of both previous and subsequent layers will adapt to half of the activations being disabled. Thus, there is no way to prevent dropout from having an (indirect) effect on earlier layers.
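
One way to see this indirect effect (a sketch, again assuming TensorFlow 2.x eager mode): the gradient flowing back through a dropped activation is zero, so on each step the upstream weights only learn from the surviving half.

import tensorflow as tf

w = tf.Variable(tf.ones([4]))   # hypothetical upstream weights
with tf.GradientTape() as tape:
    y = tf.nn.dropout(w * 2.0, rate=0.5)  # downstream dropout
    loss = tf.reduce_sum(y)

# zero gradient where the activation was dropped, 4.0 (= 2.0 * 1/keep_prob) elsewhere
print(tape.gradient(loss, w).numpy())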
