
Confusion regarding batch size while using DataLoader in pytorch

I am new to PyTorch. I am training an ANN for classification on the MNIST dataset.

train_loader = DataLoader(train_data, batch_size=6000, shuffle=True)

I am confused. The dataset has 60,000 images, I have set a batch size of 6,000, and my model trains for 30 epochs. Will every epoch see only 6,000 images, or will every epoch see 10 batches of 6,000 images?

Every call to the DataLoader iterator returns one batch of batch_size images. With 60,000 images and a batch size of 6,000, each epoch therefore consists of 10 batches, and every epoch sees all 60,000 images. The batch size only controls how many images are processed per step, not how many the epoch covers.
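You can verify this yourself. The sketch below uses a dummy TensorDataset of 60,000 samples as a stand-in for MNIST (the tensor contents are placeholders, not real image data) and counts the batches one epoch produces:

```python
import torch
from torch.utils.data import DataLoader, TensorDataset

# Dummy stand-in for the 60,000-image MNIST training set.
data = torch.arange(60000).unsqueeze(1).float()
train_data = TensorDataset(data)

train_loader = DataLoader(train_data, batch_size=6000, shuffle=True)

# Number of batches per epoch: ceil(60000 / 6000).
print(len(train_loader))  # 10

# One full epoch: iterate the loader and count the images actually seen.
images_seen = sum(batch[0].size(0) for batch in train_loader)
print(images_seen)  # 60000
```

So over 30 epochs the model sees the full dataset 30 times, 10 batches at a time.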

