How to fix Memory Error while training model?

I've been working on a Neural Network recently, but every time I try to compile the model I get a SIGKILL which, judging by Activity Monitor, comes from a memory error. My data is very large, but it isn't the cause by itself, because I tried taking only a tiny part of it and I still get the same error. This is the code I'm using:

import gzip
import numpy as np
from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import Conv1D, Dense, Flatten
from tensorflow.keras.optimizers import Adam

# Load a small slice of each gzipped .npy array
f = gzip.GzipFile('Data_x.npy.gz', "r")
datax = np.load(f)[:5, :, :]
f.close()
f = gzip.GzipFile('Data_y.npy.gz', "r")
datay = np.load(f)[:5, :, :]
f.close()
f = None
model = Sequential(
    [
        #Conv1D(32, 3, input_shape=datax.shape, activation="relu"),
        Flatten(input_shape=datax.shape),
        Dense(750, activation='relu'),
        Dense(750, activation='relu'),
        Dense(2, activation='sigmoid')
    ]
)
model.compile(optimizer=Adam(learning_rate=0.1), loss="binary_crossentropy", metrics=['accuracy'])
model1 = model.fit(x=datax, y=datay, batch_size=5, epochs=5, shuffle=True, verbose=2)

I've tried many different structures for the model and different batch sizes/epochs, but I still get this error. Any help in this matter would be greatly appreciated.
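A quick way to see why the process gets a SIGKILL is to estimate the size of the weight matrix the first Dense layer would need. Below is a minimal sketch, assuming a hypothetical shape of (5, 1000, 1000) in place of the real datax.shape; note that Keras treats input_shape as the per-sample shape, so passing the full datax.shape (batch dimension included) makes the flattened input larger than intended.

import numpy as np

# Hypothetical shape -- substitute the real datax.shape here.
shape = (5, 1000, 1000)

# Flatten(input_shape=shape) produces prod(shape) input units, so the
# first Dense(750) layer needs a prod(shape) x 750 weight matrix.
flat_units = int(np.prod(shape))
n_weights = flat_units * 750

# float32 weights take 4 bytes each; Adam also keeps two moment tensors
# per weight, roughly tripling the footprint during training.
gb = n_weights * 4 / 1e9
print(f"first Dense layer: {n_weights:,} weights, ~{gb:.1f} GB (x3 with Adam)")

With those assumed numbers the first layer alone needs roughly 15 GB of float32 weights, which is more than enough to trigger the operating system's out-of-memory killer.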

You can add a Dropout layer to your model.

Dropout is a technique where randomly selected neurons are ignored during training: they are “dropped out” at random. This means that their contribution to the activation of downstream neurons is temporarily removed on the forward pass, and no weight updates are applied to those neurons on the backward pass.

from tensorflow.keras.layers import Dropout

model = Sequential(
    [
        #Conv1D(32, 3, input_shape=datax.shape, activation="relu"),
        Flatten(input_shape=datax.shape),
        Dense(750, activation='relu'),
        Dropout(0.2),  # randomly zeroes 20% of the previous layer's outputs during training
        Dense(750, activation='relu'),
        Dropout(0.2),
        Dense(2, activation='sigmoid')
    ]
)
