
Tensorflow Dropout: What happens if I apply two dropout layers?

Let's say I am constructing a neural network like so:

x = tf.nn.conv2d(input, ...)
x = tf.nn.max_pool(x, ...)
x = tf.nn.dropout(x, keep_prob=1.)
x = tf.nn.thislayershallhavedropout(x,...)
x = tf.nn.dropout(x, keep_prob=.5)

Would this be an effective technique to tell TensorFlow to apply dropout only to the layer thislayershallhavedropout?

Basically, what I am trying to do is tell TensorFlow to use dropout only on a single layer, and not have it cascade back into earlier layers.

Dropout sets the activations that pass through it to 0 with a given probability. It is hard to speak of giving a "layer" dropout as such: you are only setting individual activations to 0 (or letting them pass through) with a given probability.

If you want to apply dropout to the outgoing connections of a certain layer, you should do:

x = tf.nn.thislayershallhavedropout(x,...)
x = tf.nn.dropout(x, keep_prob=.5)

Which is basically what you have done. So 50% of the activations coming from thislayershallhavedropout will be deactivated.

By the way, as was pointed out in the comments, setting keep_prob to 1 has no effect at all: it lets all activations pass through as normal.

x = tf.nn.dropout(x, keep_prob=1.)
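The no-op behaviour at keep_prob=1 (and the inverted-dropout scaling at lower values) can be sketched in plain NumPy. This is a hedged illustration of the standard technique, not TensorFlow's actual kernel; the function name `dropout` and the seed are my own choices:

```python
import numpy as np

def dropout(x, keep_prob, rng=None):
    """Inverted dropout: zero each activation with probability
    1 - keep_prob and scale the survivors by 1 / keep_prob, so the
    expected value of the output equals the input."""
    if keep_prob == 1.0:
        return x  # no-op: every activation passes through unchanged
    rng = rng or np.random.default_rng(0)
    mask = rng.random(x.shape) < keep_prob  # True = keep this activation
    return x * mask / keep_prob

x = np.ones((4, 4))
print(np.array_equal(dropout(x, 1.0), x))  # True: keep_prob=1 changes nothing
y = dropout(x, 0.5)
# Each entry of y is either 0.0 (dropped) or 2.0 (kept and rescaled).
print(set(y.flatten()) <= {0.0, 2.0})      # True
```

The rescaling is why dropout layers need no adjustment at inference time: the training-time output already has the same expected magnitude as the input.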

Keep in mind: dropout may not directly interfere with previous layers; however, during backpropagation the weights of previous and subsequent layers will adapt to half the activations being disabled. Thus, there is no way to prevent dropout from having an (indirect) effect on previous layers.
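That indirect effect can be made concrete with a toy two-layer linear network, sketched by hand in NumPy (a hedged illustration with made-up shapes, not anything from the TensorFlow API): a dropped hidden unit contributes zero gradient to the earlier layer's weights, so the earlier layer trains differently even though dropout never touched its forward output.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy network: h = x @ W1, out = (dropout(h)) @ W2, out is a scalar.
x = rng.standard_normal((1, 4))
W1 = rng.standard_normal((4, 3))
W2 = rng.standard_normal((3, 1))

def grad_W1(mask, keep_prob):
    """Gradient of the scalar output w.r.t. W1 when the hidden
    activations h are masked by inverted dropout."""
    # out = (h * mask / keep_prob) @ W2, so d out / d h = (mask / keep_prob) * W2
    dh = (mask / keep_prob) * W2.T  # shape (1, 3)
    return x.T @ dh                 # chain rule through h = x @ W1, shape (4, 3)

full = grad_W1(np.ones((1, 3)), 1.0)              # no dropout
masked = grad_W1(np.array([[1., 0., 1.]]), 0.5)   # middle hidden unit dropped

# Column 1 of W1's gradient vanishes (the dropped unit passes no signal back),
# and the surviving columns are rescaled, so W1's update is changed by dropout.
print(np.allclose(masked[:, 1], 0.0))  # True
```

This is exactly the "cascade back" the question asks about: it cannot be switched off, because it happens in the backward pass rather than the forward pass.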
