
Memory error when using Keras fit_generator and datagen.flow

I am trying to use datagen.flow with the ImageDataGenerator class in Keras, and I get the following memory error:

Traceback (most recent call last):
  File "scratch_6.py", line 284, in <module>
    history = model.fit_generator(datagen.flow(train_X, train_y,
        batch_size=batch_size, save_to_dir='test_RA', save_format='png'),
  File "/usr/local/lib/python3.5/dist-packages/keras/preprocessing/image.py", line 455, in flow
    save_format=save_format)
  File "/usr/local/lib/python3.5/dist-packages/keras/preprocessing/image.py", line 764, in __init__
    self.x = np.asarray(x, dtype=K.floatx())
  File "/usr/local/lib/python3.5/dist-packages/numpy/core/numeric.py", line 531, in asarray
    return array(a, dtype, copy=False, order=order)
MemoryError

I have 128GB of RAM available. I have tried reducing the batch size, but no change. Any help appreciated. Thank you.

This is a common problem in deep learning when the dataset is very large: the whole dataset cannot be held in RAM, because memory is also needed for computation and for the model itself. In addition, as the traceback shows, datagen.flow converts the input data to floats via np.asarray(x, dtype=K.floatx()), which takes roughly four times the space of the original uint8 images. The solution is to do the preprocessing and data augmentation ahead of time, save the result to an HDF5 file on your hard disk, and then fetch the data batch by batch while training. It may take longer, but it will not exhaust your memory. A minimal sketch of this approach is shown below.
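Here is one possible sketch of that workflow using h5py and a plain Python generator. The file name 'train_data.h5', the dataset names 'images'/'labels', the normalization, and the batch size are assumptions for illustration, not part of the original answer; adapt them to your own data and model.

import h5py

def save_to_hdf5(train_X, train_y, path='train_data.h5'):
    """Write the already preprocessed/augmented arrays to disk once."""
    with h5py.File(path, 'w') as f:
        # Keep images as uint8 on disk; convert to float32 one batch at a time.
        f.create_dataset('images', data=train_X, dtype='uint8')
        f.create_dataset('labels', data=train_y)

def hdf5_batch_generator(path='train_data.h5', batch_size=32):
    """Yield (x, y) batches read from disk, so RAM only ever holds one batch."""
    with h5py.File(path, 'r') as f:
        images, labels = f['images'], f['labels']
        n = images.shape[0]
        while True:  # Keras generators must loop forever
            for start in range(0, n, batch_size):
                stop = min(start + batch_size, n)
                x = images[start:stop].astype('float32') / 255.0
                y = labels[start:stop]
                yield x, y

# Usage (model and step count are placeholders):
# gen = hdf5_batch_generator('train_data.h5', batch_size=32)
# model.fit_generator(gen, steps_per_epoch=n_samples // 32, epochs=10)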

Thanks, Kunal
