
Pytorch Neural Network Errors

I am trying to compute the loss and accuracy of a machine learning model using PyTorch, and I am having trouble initializing the dataset so that it can run. Using the Moon dataset, I get a few errors when I run the code. I first initialize the dataset:

    from sklearn.datasets import make_moons
    from sklearn.model_selection import train_test_split
    import torch
    from torch.autograd import Variable

    X, y = make_moons(200, noise=0.2, random_state=42)
    X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.30, random_state=1, stratify=y)
    x, y = Variable(torch.from_numpy(X_train)).float(), Variable(torch.from_numpy(y_train)).float()
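(For reference: since PyTorch 0.4 the `Variable` wrapper is deprecated and unnecessary; the same preparation can be sketched with plain tensors, using only the names from the snippet above.)

```python
import torch
from sklearn.datasets import make_moons
from sklearn.model_selection import train_test_split

X, y = make_moons(200, noise=0.2, random_state=42)
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.30, random_state=1, stratify=y)

# torch.from_numpy gives a float64 tensor; .float() converts to float32
x_t = torch.from_numpy(X_train).float()
y_t = torch.from_numpy(y_train).float()
```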

and then when I run the neural network:

    import torch.nn as nn

    class SoftmaxRegression(nn.Module):
        def __init__(self):
            super(SoftmaxRegression, self).__init__()
            self.fc = nn.Linear(200, 1)
            self.softmax = nn.Softmax()

        def forward(self, x):
            x = self.fc(x)
            x = self.softmax(x)
            return x

I get the following errors:

    UserWarning: Implicit dimension choice for softmax has been deprecated. Change the call to include dim=X as an argument.
      x = F.softmax(self.layer(x))

    ret = torch._C._nn.nll_loss(input, target, weight, _Reduction.get_enum(reduction), ignore_index)
    IndexError: Target 1 is out of bounds.

How can I fix this so that the model runs on the dataset and outputs the loss and accuracy?

(Sorry to put this as an answer, but unfortunately Stack Overflow won't let me comment.)

Even if the Softmax worked, it would be pointless here (unless you are softmaxing across your batch, which would be very unusual). Your code has a linear layer mapping 200 inputs to a single output. Softmax over a single value always returns 1, so the output carries no information; softmax should only be applied across 2 or more values.
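If you do want a softmax-style setup, the standard pattern is two output logits with `CrossEntropyLoss`, which applies log-softmax internally. This sketch (illustrative, not the asker's code) also fixes two other issues in the question: `make_moons` produces 2 features per sample (so `in_features` should be 2, not 200), and the targets for the loss must be of dtype long, which is what causes the "Target 1 is out of bounds" IndexError with a single output.

```python
import torch
import torch.nn as nn

# Two output logits + CrossEntropyLoss; no explicit Softmax layer needed,
# since CrossEntropyLoss applies log-softmax internally.
model = nn.Linear(2, 2)               # 2 input features -> 2 class logits
criterion = nn.CrossEntropyLoss()

x = torch.randn(8, 2)                 # batch of 8 two-feature samples
target = torch.randint(0, 2, (8,))    # class labels must be dtype long (int64)
loss = criterion(model(x), target)
```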

If you wish to do binary classification, I would instead change the code to this:

    import torch

    def forward(self, x):
        x = self.fc(x)
        x = torch.sigmoid(x)  # F.sigmoid is deprecated in favor of torch.sigmoid
        return x
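A usage sketch of that sigmoid variant with `BCELoss` (the `Linear(2, 1)` input size, the class name, and the batch here are illustrative assumptions, not from the question):

```python
import torch
import torch.nn as nn

class BinaryClassifier(nn.Module):
    # Single logit squashed by sigmoid; input size 2 matches the two
    # features produced by make_moons (an assumption for this sketch).
    def __init__(self):
        super().__init__()
        self.fc = nn.Linear(2, 1)

    def forward(self, x):
        return torch.sigmoid(self.fc(x))

model = BinaryClassifier()
criterion = nn.BCELoss()                  # expects probabilities and float targets
x = torch.randn(8, 2)
y = torch.randint(0, 2, (8, 1)).float()   # BCELoss targets must be float

probs = model(x)
loss = criterion(probs, y)
accuracy = ((probs > 0.5).float() == y).float().mean()
```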
