I want to load mnist data using
test_dataset = datasets.MNIST(root='./mnist_data/', train=False, transform=transforms.ToTensor(), download=False)
test_loader = torch.utils.data.DataLoader(dataset=test_dataset, batch_size=1, shuffle=False)
However, this code loads all 10,000 handwritten-digit test examples. Is there any way to keep test_loader as the same type of object while limiting it to only the first 100 examples?
I tried intuitively to do:
test_loader.dataset = test_loader.dataset[0:99]
But I got this error:
ValueError: only one element tensors can be converted to Python scalars
because the dataset object does not support slicing with test_loader.dataset[0:99].
Could you please help me solve this issue?
You may approach this a bit differently. For example, use a loop that iterates 100 times and pulls one sample from the dataset in each iteration. It may look like this:
test_dataset = datasets.MNIST(root='./mnist_data/', train=False, transform=transforms.ToTensor(), download=False)
# make the dataset an iterator
test_dataset_iter = iter(test_dataset)
# then fetch one sample at a time with `next`
for _ in range(100):
    image, label = next(test_dataset_iter)
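If you specifically want a DataLoader of the same type as before, just restricted to 100 samples, another option is torch.utils.data.Subset, which wraps any dataset and exposes only the indices you pass. Below is a minimal runnable sketch; it uses a synthetic TensorDataset as a stand-in so it runs without downloading MNIST, but the two Subset/DataLoader lines apply unchanged to your real test_dataset:

```python
import torch
from torch.utils.data import DataLoader, Subset, TensorDataset

# Stand-in for the MNIST test set: 1000 fake 1x28x28 "images" with labels.
# With the real data you would use your existing `test_dataset` here instead.
full_dataset = TensorDataset(torch.randn(1000, 1, 28, 28),
                             torch.randint(0, 10, (1000,)))

# Subset restricts the dataset to the given indices (here the first 100).
small_dataset = Subset(full_dataset, range(100))

# Same DataLoader type and arguments as before, now over only 100 samples.
test_loader = DataLoader(dataset=small_dataset, batch_size=1, shuffle=False)

print(len(small_dataset))            # 100
print(sum(1 for _ in test_loader))   # 100 batches of size 1
```

This keeps all of the DataLoader machinery (batching, shuffling, workers) intact, so any code that already consumes test_loader works without changes.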