
MemoryError in numpy normalization

I am working on a large dataset consisting of images.

When I run the following code:

import cv2
import numpy as np

data = []

def image_to_feature_vector(image, size=(128, 128)):
    # Resize the image and flatten it into a 1-D feature vector
    return cv2.resize(image, size).flatten()

for i in range(0, len(imagePath)):
    image = cv2.imread(imagePath[i])
    features = image_to_feature_vector(image)
    data.append(features)

data = np.array(data) / 255.0

I get the following error:

np.array(data) / 255.0

MemoryError

How can I fix this? Thanks in advance!

Some easy memory-saving strategies include:

1. Preallocate data and avoid building up a temporary Python list:

data = np.empty((len(imagePath),) + features_shape)   # features_shape is the shape of one feature vector, e.g. (128 * 128 * 3,)
for i, slc in enumerate(data):
    image = cv2.imread(imagePath[i])
    features = image_to_feature_vector(image)
    slc[...] = features   # write directly into the preallocated row

2. Use in-place operations where possible:

data /= 255.0   # divides in place instead of allocating a second full-size array
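
Putting both ideas together, here is a minimal sketch, assuming imagePath is a list of image file paths as in the question; the float32 dtype is an extra assumption that halves memory compared with NumPy's default float64:

import cv2
import numpy as np

def image_to_feature_vector(image, size=(128, 128)):
    # Resize and flatten into a 1-D feature vector
    return cv2.resize(image, size).flatten()

n_features = 128 * 128 * 3  # flattened 128x128 BGR image
# Preallocate once; float32 uses half the memory of the default float64 (assumption)
data = np.empty((len(imagePath), n_features), dtype=np.float32)

for i, path in enumerate(imagePath):
    image = cv2.imread(path)
    data[i] = image_to_feature_vector(image)  # write straight into the preallocated row

data /= 255.0  # normalize in place, no second full-size copy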
