
PyTorch tensor data is disturbed when I convert a data loader tensor to a NumPy array

I am using a simple training loop for a regression task. To make sure the regression ground-truth values are what I expect inside the training loop, I decided to plot each batch of data. However, when I convert the data loader's tensor to a NumPy array and plot it, the data appears disturbed (out of its original order). I am using myTensor.data.cpu().numpy() for the conversion.

My code is as below:

    import torch
    from torch.utils.data import TensorDataset, DataLoader
    from matplotlib import pyplot

    train_ds = TensorDataset(x_train, y_train)
    train_dl = DataLoader(train_ds, batch_size=32, shuffle=True,
                          num_workers=0, drop_last=True)

    for epoch in range(epochs):
        model.train()
        for i, (x, y) in enumerate(train_dl):
            x = x.cuda()
            y = y.cuda()
            yy = y.data.cpu().numpy()
            # plot the first column of this batch's ground-truth values
            pyplot.plot(yy[0:32, 0])
            pyplot.show()

[screenshot: plot of yy[0:32, 0] for one batch]

I think it is because I set shuffle = True in the data loader. If I set it to False, the plot is fine. However, if I set shuffle = False in the data loader, how can I still shuffle the training batches after each epoch?
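A minimal sketch of how I can check what the shuffle is doing (the IndexedTensorDataset wrapper and the dummy x_train / y_train below are assumptions for illustration, not part of my real code): a DataLoader with shuffle = True draws a new random ordering of the dataset at the start of every epoch, so a plotted batch will not follow the original sequence of y_train. By also returning each sample's index, the batch can be sorted back into its original order before plotting, while the per-epoch shuffling stays enabled.

    import torch
    from torch.utils.data import Dataset, DataLoader
    from matplotlib import pyplot

    # Hypothetical helper (not in my original code): a dataset that also
    # returns each sample's original index, so a shuffled batch can be re-ordered.
    class IndexedTensorDataset(Dataset):
        def __init__(self, x, y):
            self.x, self.y = x, y
        def __len__(self):
            return self.x.shape[0]
        def __getitem__(self, idx):
            return self.x[idx], self.y[idx], idx

    # Dummy stand-ins for x_train / y_train from my real code.
    x_train = torch.randn(256, 4)
    y_train = torch.sin(torch.linspace(0, 6.28, 256)).unsqueeze(1)

    train_dl = DataLoader(IndexedTensorDataset(x_train, y_train),
                          batch_size=32, shuffle=True, drop_last=True)

    for x, y, idx in train_dl:
        yy = y.cpu().numpy()
        order = idx.numpy().argsort()   # positions that restore the original order
        pyplot.plot(yy[order, 0])       # batch plotted in original sample order
        pyplot.show()
        break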
