Python - Preprocessing a lot of image data - Loop freezes

I have the following code in my Python script:

import matplotlib.pyplot as plt

def process_data(data):
    features = []

    for i in range(len(data)):  # data has roughly 33,000 rows
        for j in range(3):      # three image paths per row
            img = plt.imread(data[i][j].strip())  # paths come from a CSV file
            img = normalize(img)                  # simple per-image calculation

            features += img.flatten().tolist()    # append every pixel value

    return features

This should run about 33,000 × 3 times to preprocess the whole dataset. However, after around 10,000-12,000 iterations the script slows down radically and freezes, and sometimes my machine (i7 at 3.6 GHz, 8 GB RAM) freezes as well.

What can I do? It's difficult to split up the data. I was told to use Keras' fit_generator for this, but how would I do that?

Depending on how large those image files are, you could be running out of RAM and being forced to swap, which would make your system sluggish. Instead of gathering all the processed images into one list, can you write them out one at a time?
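
A minimal sketch of that one-at-a-time idea, assuming normalize is the asker's own function and that raw float32 output in a file named features.bin (name hypothetical) is acceptable: each flattened image goes straight to disk, so memory use stays flat no matter how many rows data has. The second function sketches the generator route the asker was pointed to, since Keras' fit_generator consumes a generator that yields one batch per step, so the full dataset never sits in RAM at once.

import numpy as np
import matplotlib.pyplot as plt

def process_data_to_disk(data, out_path="features.bin"):
    # Write each processed image to disk immediately instead of
    # accumulating ~99,000 flattened images in one Python list.
    with open(out_path, "wb") as f:
        for row in data:                # one CSV row at a time
            for j in range(3):          # three image paths per row
                img = normalize(plt.imread(row[j].strip()))
                img.astype(np.float32).ravel().tofile(f)  # write, then drop

def image_batches(data, batch_size=32):
    # Lazy loading for fit_generator: only the current batch is in memory.
    while True:  # fit_generator expects an endless stream of batches
        for start in range(0, len(data), batch_size):
            batch = []
            for row in data[start:start + batch_size]:
                imgs = [normalize(plt.imread(row[j].strip())) for j in range(3)]
                batch.append(np.concatenate([im.ravel() for im in imgs]))
            yield np.asarray(batch)

The written features can be read back with np.fromfile("features.bin", dtype=np.float32). Note that fit_generator actually expects (inputs, targets) tuples from the generator; labels are omitted here because they do not appear in the question.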
