
Dropout Layer with zero dropping rate

I'm having trouble understanding a certain aspect of dropout layers in PyTorch.

As stated in the PyTorch documentation, the method's signature is torch.nn.Dropout(p=0.5, inplace=False), where p is the dropout rate.

What does this layer do when p=0? Does it change its input in any way?

Dropout with p=0 is equivalent to the identity operation.

In fact, this is the exact behaviour of Dropout modules when set in eval mode:

During evaluation the module simply computes an identity function.
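To make this concrete, here is a minimal stdlib-only sketch of (inverted) dropout in plain Python, not PyTorch's actual implementation, just an illustration of the same semantics: with p=0, or in eval mode, every element passes through unchanged, so the layer reduces to the identity.

```python
import random

def dropout(x, p=0.5, training=True):
    """Inverted dropout on a list of floats (illustrative sketch).

    During training, each element is zeroed with probability p and the
    survivors are scaled by 1/(1-p) so the expected value is preserved.
    With p=0 or training=False, the input is returned unchanged.
    """
    if not training or p == 0:
        # Identity: no elements are dropped, no scaling is applied.
        return list(x)
    if p == 1:
        # Everything is dropped.
        return [0.0 for _ in x]
    return [0.0 if random.random() < p else v / (1 - p) for v in x]

x = [1.0, 2.0, 3.0]
print(dropout(x, p=0.0))          # identical to x
print(dropout(x, training=False)) # eval mode: also identical to x
```

With p=0 the mask never zeroes anything and the 1/(1-p) scale factor is exactly 1, which is why Dropout(p=0) and eval mode both behave as the identity.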

