
How to add dropout and attention in LSTM in Keras in Python

I have a dataset of about 1000 nodes, where each node has 4 time series. Each time series is exactly 6 steps long. The label is 0 or 1 (i.e. binary classification).

More precisely, my dataset looks as follows.

node, time-series1, time_series2, time_series_3, time_series4, Label
n1, [1.2, 2.5, 3.7, 4.2, 5.6, 8.8], [6.2, 5.5, 4.7, 3.2, 2.6, 1.8], …, 1
n2, [5.2, 4.5, 3.7, 2.2, 1.6, 0.8], [8.2, 7.5, 6.7, 5.2, 4.6, 1.8], …, 0
and so on.

I normalise my time series before feeding them into my LSTM model for classification.
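(The question does not show how the normalisation is done; below is a minimal sketch with synthetic data, assuming per-feature min-max scaling over all nodes and timesteps — the poster's actual scheme may differ.)

```python
import numpy as np

rng = np.random.default_rng(0)
data = rng.normal(size=(1000, 6, 4))  # stand-in for the real (nodes, timesteps, features) array

# Min-max scale each of the 4 features to [0, 1], using statistics
# computed across all nodes and all timesteps.
mins = data.min(axis=(0, 1), keepdims=True)
maxs = data.max(axis=(0, 1), keepdims=True)
data = (data - mins) / (maxs - mins)
```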

from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import LSTM, Dense

model = Sequential()
model.add(LSTM(10, input_shape=(6,4)))
model.add(Dense(32))
model.add(Dense(1, activation='sigmoid'))
model.compile(loss='binary_crossentropy', optimizer='adam', metrics=['accuracy'])

print(data.shape) # (1000, 6, 4)
model.fit(data, target)

I am new to Keras, which is why I started with the simplest LSTM model. However, I would now like to bring it up to a level I could use in industry.

I have read that it is good to add dropout and attention layers to LSTM models. Please let me know whether adding such layers is applicable to my problem, and if so, how to do it. :)

Note: I am not limited to dropout and attention layers, and I am happy to receive other suggestions I can use to improve my model.

I am happy to provide more details if needed.

If you want to add dropout inside the LSTM cell, you can try this:

from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import LSTM, Dense

model = Sequential()
model.add(LSTM(10, input_shape=(6,4), dropout=0.5))  # dropout applied to the LSTM's inputs
model.add(Dense(32))
model.add(Dense(1, activation='sigmoid'))
model.compile(loss='binary_crossentropy', optimizer='adam', metrics=['accuracy'])

print(data.shape) # (1000, 6, 4)
model.fit(data, target)

Or, to use dropout between stacked LSTM layers, consider the following:

from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import LSTM, Dense, Dropout

model = Sequential()
model.add(LSTM(10, input_shape=(6,4), return_sequences=True))  # return per-timestep outputs so the next LSTM receives a sequence
model.add(Dropout(0.5))
model.add(LSTM(10))
model.add(Dense(32))
model.add(Dense(1, activation='sigmoid'))
model.compile(loss='binary_crossentropy', optimizer='adam', metrics=['accuracy'])

print(data.shape) # (1000, 6, 4)
model.fit(data, target)
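The question also asked about attention, which the Sequential examples above do not cover. One way to add it is with Keras's functional API and the built-in keras.layers.Attention (dot-product self-attention over the LSTM's per-timestep outputs); the choice of self-attention and average pooling here is a sketch of one possible design, not the only way to do it:

```python
from tensorflow.keras.layers import (Input, LSTM, Dense, Attention,
                                     GlobalAveragePooling1D)
from tensorflow.keras.models import Model

inputs = Input(shape=(6, 4))
# return_sequences=True keeps one output vector per timestep,
# which is what the attention layer needs.
x = LSTM(10, return_sequences=True, dropout=0.5)(inputs)
# Self-attention: the LSTM outputs act as both query and value.
attended = Attention()([x, x])
# Collapse the time dimension before the classifier head.
pooled = GlobalAveragePooling1D()(attended)
hidden = Dense(32, activation='relu')(pooled)
outputs = Dense(1, activation='sigmoid')(hidden)

model = Model(inputs, outputs)
model.compile(loss='binary_crossentropy', optimizer='adam',
              metrics=['accuracy'])
```

The model can then be trained with model.fit(data, target) exactly as before, since the input shape is still (6, 4).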
