
Where is the len function used in PyTorch Dataset?

I am looking to use the code from here. However, in box 5 there is the following function:

def __len__(self):
    # Default epoch size is 10 000 samples
    return 10000

I do not see anywhere in this script where this function is being used. Clarification on this would be appreciated.

Also, I want to determine the number of image patches used for training this convolutional neural network. Is this len function linked to the number of patches?

This is a method of the Dataset class. The __len__() method specifies the size of the dataset. In the code you referenced, in box 10, a dataset is initialized and passed to a DataLoader object:

train_set = ISPRS_dataset(train_ids, cache=CACHE)
train_loader = torch.utils.data.DataLoader(train_set, batch_size=BATCH_SIZE)

You can see that the dataset object is passed to the DataLoader along with the batch size. The DataLoader then uses the __len__ method of the Dataset to decide how many samples to draw per epoch and, together with the batch size, how many batches to create. This happens in box 13, where the code iterates over the DataLoader.
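
To illustrate, here is a minimal, self-contained sketch. It is not the original notebook code: RandomPatchDataset is a hypothetical stand-in for ISPRS_dataset, returning random tensors instead of real image patches. It shows how the value returned by __len__ sets the number of samples per epoch and, divided by batch_size, the number of batches the DataLoader yields:

import torch
from torch.utils.data import Dataset, DataLoader

class RandomPatchDataset(Dataset):
    # Hypothetical stand-in for ISPRS_dataset, just to show the mechanics.
    def __init__(self, patch_size=(3, 64, 64)):
        self.patch_size = patch_size

    def __len__(self):
        # The DataLoader's sampler calls len(dataset) to decide how many
        # indices to draw, i.e. how many samples make up one epoch.
        return 10000

    def __getitem__(self, idx):
        # Return a random "patch" and a dummy label; the real code crops
        # patches from the ISPRS tiles instead.
        return torch.rand(self.patch_size), torch.tensor(0)

train_set = RandomPatchDataset()
train_loader = DataLoader(train_set, batch_size=10)

print(len(train_set))     # 10000 -> samples per epoch, from __len__
print(len(train_loader))  # 1000  -> batches per epoch = 10000 / 10

# Iterating over the loader (as in box 13) yields exactly len(train_loader) batches.
for data, target in train_loader:
    pass

So yes, in that script the 10000 returned by __len__ is the number of patches sampled per epoch (the "epoch size" mentioned in the comment), not the total number of distinct patches in the data.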
