Keras with TensorFlow backend --- MemoryError
I am trying to follow this tutorial to learn a bit about deep learning with Keras, but I keep getting a MemoryError. Can you please point out what is causing it and how to fix it?

Here is the code:
```python
import numpy as np
from keras import models, regularizers, layers
from keras.datasets import imdb

# Load the IMDB dataset, keeping only the 10,000 most frequent words
(train_data, train_labels), (test_data, test_labels) = imdb.load_data(num_words=10000)

def vectorize_sequences(sequences, dimension=10000):
    # One-hot encode each sequence into a (len(sequences), dimension) matrix
    results = np.zeros((len(sequences), dimension))
    for i, sequence in enumerate(sequences):
        results[i, sequence] = 1.
    return results

x_train = vectorize_sequences(train_data)
```
Here is the traceback (the line numbers don't match the code above):
```
Traceback (most recent call last):
  File "<input>", line 1, in <module>
  File "/home/uttam/pycharm-2018.2.4/helpers/pydev/_pydev_bundle/pydev_umd.py", line 197, in runfile
    pydev_imports.execfile(filename, global_vars, local_vars)  # execute the script
  File "/home/uttam/pycharm-2018.2.4/helpers/pydev/_pydev_imps/_pydev_execfile.py", line 18, in execfile
    exec(compile(contents+"\n", file, 'exec'), glob, loc)
  File "/home/uttam/PycharmProjects/IMDB/imdb.py", line 33, in <module>
    x_train = vectorize_sequences(train_data)
  File "/home/uttam/PycharmProjects/IMDB/imdb.py", line 27, in vectorize_sequences
    results = np.zeros((len(sequences), dimension))
MemoryError
```
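For scale, the array that fails to allocate here holds 25,000 sequences (the IMDB training set) times 10,000 dimensions in `float64`, which comes to roughly 2 GB in one contiguous block. A quick back-of-the-envelope check:

```python
import numpy as np

# IMDB training set size and the num_words vocabulary size from the question
n_sequences, dimension = 25000, 10000

# Bytes required by np.zeros((n_sequences, dimension)), which defaults to float64
size_bytes = n_sequences * dimension * np.dtype(np.float64).itemsize
print(size_bytes)            # 2000000000 bytes
print(size_bytes / 1024**3)  # ~1.86 GiB
```

A single 2 GB allocation can easily fail on a 32-bit Python build or on a machine that is already low on memory, which is why the error appears on the `np.zeros` line.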
Yes, you are correct. The problem does arise from `vectorize_sequences`.

You should do that logic in batches (slicing the data, as with `partial_x_train`) or use generators (here is a good explanation and example).

I hope this helps :)
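As a minimal sketch of the generator approach, assuming the same `vectorize_sequences` helper from the question and a hypothetical `batch_size` of 512, a generator can yield one one-hot-encoded batch at a time so that only `batch_size × dimension` floats are ever in memory at once:

```python
import numpy as np

def vectorize_sequences(sequences, dimension=10000):
    # One-hot encode a (small) batch of sequences
    results = np.zeros((len(sequences), dimension))
    for i, sequence in enumerate(sequences):
        results[i, sequence] = 1.
    return results

def batch_generator(data, labels, batch_size=512, dimension=10000):
    # Loop over the dataset forever, yielding one encoded batch at a time;
    # only one (batch_size, dimension) array exists in memory per step
    while True:
        for start in range(0, len(data), batch_size):
            batch = data[start:start + batch_size]
            x = vectorize_sequences(batch, dimension)
            y = np.asarray(labels[start:start + batch_size], dtype='float32')
            yield x, y
```

You would then train with something like `model.fit_generator(batch_generator(train_data, train_labels), steps_per_epoch=len(train_data) // 512, epochs=4)` (names and values here are illustrative, not from the question).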