Is this the right way to apply softmax?
self.classifier = nn.Sequential(
    nn.Flatten(),
    nn.Linear(in_features=32*8*8, out_features=26),
    nn.ReLU(),
    nn.Linear(in_features=26, out_features=output_shape),
    nn.Softmax(dim=1)
)
and my loss fn is
loss_fn = nn.CrossEntropyLoss()
optimizer = torch.optim.Adam(params=model_0.parameters(),
                             lr=0.07)
Is that the right way to use softmax?
output_shape is equal to the number of classes (this is multi-class classification).
If my implementation isn't wrong, then why does all of my data in one batch output the same class (each sample even has very similar output probabilities)?
No, CrossEntropyLoss doesn't require Softmax, as it already includes it (or actually LogSoftmax): https://pytorch.org/docs/stable/generated/torch.nn.CrossEntropyLoss.html?highlight=crossentropy#torch.nn.CrossEntropyLoss
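To see why the extra Softmax layer is redundant (and actually harmful), here is a minimal plain-Python sketch of the per-sample computation. The log_softmax and cross_entropy helpers are illustrative stand-ins for what nn.CrossEntropyLoss does internally, not PyTorch's actual implementation:

```python
import math

def log_softmax(logits):
    # Numerically stable log-softmax: x_i - log(sum_j exp(x_j))
    m = max(logits)
    lse = m + math.log(sum(math.exp(x - m) for x in logits))
    return [x - lse for x in logits]

def cross_entropy(logits, target):
    # What nn.CrossEntropyLoss computes for one sample:
    # the negative log-probability of the target class,
    # with log-softmax applied to the raw logits internally.
    return -log_softmax(logits)[target]

logits = [2.0, 0.5, -1.0]
loss_from_logits = cross_entropy(logits, 0)

# If the model already ends in Softmax, the loss is effectively applied
# to softmax(softmax(logits)). The probabilities all lie in [0, 1], so
# the second softmax flattens them toward uniform, which weakens the
# training signal for every sample in the batch.
probs = [math.exp(v) for v in log_softmax(logits)]
loss_double_softmax = cross_entropy(probs, 0)

print(loss_from_logits)      # loss on raw logits
print(loss_double_softmax)   # larger, flatter loss after double softmax
```

So the fix is simply to drop the nn.Softmax(dim=1) line from the Sequential and pass the raw logits straight to the loss; if you need probabilities at inference time, apply torch.softmax to the model output there instead.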