
mxnet: multiple dropout layers with shared mask

I'd like to reproduce a recurrent neural network where each time layer is followed by a dropout layer, and these dropout layers share their masks. This structure was described in, among others, A Theoretically Grounded Application of Dropout in Recurrent Neural Networks.

As far as I understand the code, the recurrent network models implemented in MXNet do not have any dropout layers applied between time layers; the dropout parameter of functions such as lstm (R API, Python API) actually defines dropout on the input. Therefore I'd need to reimplement these functions from scratch.
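For context, stacking cells with ordinary, independently sampled dropout between layers looks roughly like the sketch below with the mx.rnn cell API (the layer sizes, dropout rate, and sequence length are arbitrary placeholders); note that DropoutCell draws a fresh mask at every step, which is exactly what I want to avoid.

```python
import mxnet as mx

# Minimal sketch: two stacked LSTM layers with ordinary dropout in between.
# mx.rnn.DropoutCell samples a new, independent mask at every time step,
# so the masks are NOT shared across steps or layers.
data = mx.sym.Variable('data')          # (batch, seq_len, input_dim), layout 'NTC'

stack = mx.rnn.SequentialRNNCell()
stack.add(mx.rnn.LSTMCell(num_hidden=256, prefix='lstm0_'))
stack.add(mx.rnn.DropoutCell(dropout=0.5))
stack.add(mx.rnn.LSTMCell(num_hidden=256, prefix='lstm1_'))

outputs, states = stack.unroll(length=35, inputs=data,
                               layout='NTC', merge_outputs=True)
```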

However, the Dropout layer does not seem to take a variable that defines the mask as a parameter.

Is it possible to make multiple dropout layers in different places of the computation graph, yet sharing their masks?

According to the discussion here, it is not possible to specify the mask, and setting a random seed does not affect dropout's random number generator.
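If you still need shared masks, one possible workaround (not a built-in MXNet feature; a minimal sketch that assumes the mask shape, here a placeholder (32, 256), is known when the graph is built) is to sample a single mask per batch yourself, feed it in as an ordinary input variable, and multiply it into the graph wherever you want dropout applied, so every application shares the same random draw:

```python
import mxnet as mx

# The mask is an ordinary input variable, so one Bernoulli draw per batch
# can be fed in and reused everywhere it is multiplied into the graph.
mask = mx.sym.Variable('shared_dropout_mask')   # shape (batch, num_hidden)

def apply_shared_dropout(data, mask):
    # Multiplying the SAME mask symbol at several places means every
    # application sees the identical random draw for that forward pass.
    return mx.sym.broadcast_mul(data, mask)

h1 = mx.sym.Variable('h1')          # e.g. hidden state after time layer 1
h2 = mx.sym.Variable('h2')          # e.g. hidden state after time layer 2
h1_drop = apply_shared_dropout(h1, mask)
h2_drop = apply_shared_dropout(h2, mask)

# At training time, sample one inverted-dropout mask per batch and bind it
# to 'shared_dropout_mask'; at inference time, feed a mask of ones instead.
keep_prob = 0.8
mask_value = (mx.nd.random.uniform(0, 1, shape=(32, 256)) < keep_prob) / keep_prob
```

The inverted-dropout scaling (dividing the 0/1 mask by keep_prob) keeps the expected activation unchanged, so at test time nothing needs rescaling beyond feeding a mask of ones. Unlike mx.sym.Dropout, this hand-built mask is not switched off automatically outside of training.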


 