How do you apply dropout to the following fully connected network in PyTorch?
import torch.nn as nn
import torch.nn.functional as F

class NetworkRelu(nn.Module):
    def __init__(self):
        super().__init__()
        self.fc1 = nn.Linear(784, 128)
        self.fc2 = nn.Linear(128, 64)
        self.fc3 = nn.Linear(64, 10)

    def forward(self, x):
        x = F.relu(self.fc1(x))
        x = F.relu(self.fc2(x))
        x = F.softmax(self.fc3(x), dim=1)
        return x
Since the forward method already uses the functional API, you could apply functional dropout (F.dropout) there as well. However, it is better to register an nn.Dropout module in __init__(): when the model is switched to evaluation mode with model.eval(), the dropout layer is then turned off automatically, whereas the functional form requires you to pass the training flag yourself on every call.
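For comparison, a functional-dropout version might look like the sketch below (the class name NetworkReluFunctional and p=0.5 are illustrative assumptions). Note that training=self.training must be passed explicitly, because F.dropout defaults to training=True even when the model is in eval mode:

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class NetworkReluFunctional(nn.Module):
    """Hypothetical variant using functional dropout, for comparison only."""
    def __init__(self):
        super().__init__()
        self.fc1 = nn.Linear(784, 128)
        self.fc2 = nn.Linear(128, 64)
        self.fc3 = nn.Linear(64, 10)

    def forward(self, x):
        # training=self.training must be forwarded by hand; omitting it
        # would keep dropout active even after model.eval()
        x = F.dropout(F.relu(self.fc1(x)), p=0.5, training=self.training)
        x = F.dropout(F.relu(self.fc2(x)), p=0.5, training=self.training)
        return F.softmax(self.fc3(x), dim=1)
```

This works, but it is easy to forget the training argument, which is why the nn.Module version below is preferred.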
Here is the code to implement dropout:
class NetworkRelu(nn.Module):
    def __init__(self):
        super().__init__()
        self.fc1 = nn.Linear(784, 128)
        self.fc2 = nn.Linear(128, 64)
        self.fc3 = nn.Linear(64, 10)
        self.dropout = nn.Dropout(p=0.5)

    def forward(self, x):
        x = self.dropout(F.relu(self.fc1(x)))
        x = self.dropout(F.relu(self.fc2(x)))
        x = F.softmax(self.fc3(x), dim=1)
        return x
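To confirm that model.eval() really disables dropout, here is a small standalone check on nn.Dropout itself (an illustrative sketch, not part of the original answer):

```python
import torch
from torch import nn

drop = nn.Dropout(p=0.5)
x = torch.ones(4, 8)

drop.train()       # training mode: each unit is zeroed with probability 0.5,
y_train = drop(x)  # and surviving units are scaled by 1/(1-p) = 2.0

drop.eval()        # eval mode: dropout acts as the identity
y_eval = drop(x)   # y_eval equals x exactly
```

model.train() and model.eval() set this mode recursively on every submodule, which is exactly why registering nn.Dropout in __init__() is the convenient choice.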