
What is the difference between the Dropout layer and the dropout parameter in a Keras layer?

What is the difference between the Dropout layer and the dropout and recurrent_dropout parameters in Keras? Do they all serve the same purpose?

Example:

model.add(Dropout(0.2))  # layer
model.add(LSTM(100, dropout=0.2, recurrent_dropout=0.2))  # parameters

Yes, they serve the same purpose. The dropout parameter applies dropout to a layer's inputs before that layer's linear transformation (multiplication by the weights and addition of the bias). A standalone Dropout layer is more flexible: it can be placed anywhere in the model, for example between a linear transformation and its activation.
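
A minimal sketch of that placement, assuming the TensorFlow Keras API (the layer sizes and rates here are arbitrary placeholders, not from the question):

from tensorflow.keras import Sequential, Input
from tensorflow.keras.layers import Activation, Dense, Dropout

model = Sequential([
    Input(shape=(32,)),    # arbitrary number of input features
    Dense(64),             # linear transformation only, no activation yet
    Dropout(0.2),          # drop 20% of the pre-activation values
    Activation("relu"),    # nonlinearity applied after dropout
    Dense(1),
])

The dropout parameter alone cannot express this ordering, because it always acts on the layer's inputs.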

recurrent_dropout has the same functionality but acts in a different direction: whereas dropout is applied on the path from input to output, recurrent_dropout is applied to the recurrent connections, i.e., the hidden state passed from one timestep to the next.
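
To make the comparison concrete, here is a sketch assuming the TensorFlow Keras API (the input shape and rates are made up for the example). Both variants apply dropout to the LSTM's inputs; recurrent_dropout exists only as a parameter because it acts inside the recurrent loop:

from tensorflow.keras import Sequential, Input
from tensorflow.keras.layers import Dense, Dropout, LSTM

# Variant A: a standalone Dropout layer drops 20% of the input features
# before they reach the LSTM.
model_a = Sequential([
    Input(shape=(10, 8)),   # (timesteps, features)
    Dropout(0.2),
    LSTM(100),
    Dense(1),
])

# Variant B: dropout drops 20% of the input units inside the LSTM;
# recurrent_dropout drops 20% of the recurrent units between timesteps.
model_b = Sequential([
    Input(shape=(10, 8)),
    LSTM(100, dropout=0.2, recurrent_dropout=0.2),
    Dense(1),
])

model_a.summary()
model_b.summary()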
