
Keras LSTM for continuous input and continuous output

For example, I have binary data, say: 0, 0, 0, 1, 1, 0, 1, 1. This may continue indefinitely. For each input there is a corresponding output. Say we use an XOR operation, where each output bit is the XOR of the current input bit and the previous one. So the output may look like this: 0, 0, 0, 1, 0, 1, 1, 0.
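To make the relationship concrete, here is a small sketch of how such a stream could be generated (assuming each output bit is the XOR of the current input bit and the previous one, with the bit before the first input taken as 0):

def xor_stream(bits):
    # Each output bit is the XOR of the current input bit and the previous one.
    prev = 0
    out = []
    for b in bits:
        out.append(b ^ prev)
        prev = b
    return out

print(xor_stream([0, 0, 0, 1, 1, 0, 1, 1]))  # [0, 0, 0, 1, 0, 1, 1, 0]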

How do I shape the Keras input? How do I set the timesteps? If I declare timesteps=1, is each timestep treated as an independent case, or can the network still take previous inputs into account as a sequence or learned memory?
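Concretely, the two ways of shaping the data that I try below correspond to these NumPy array shapes (the variable names here are just illustrative):

import numpy as np

whole_sequence = np.asarray([[[0], [0], [0], [1], [1], [0], [1], [1]]])
print(whole_sequence.shape)  # (1, 8, 1): 1 sample, 8 timesteps, 1 feature

one_step_samples = np.asarray([[[0]], [[0]], [[0]], [[1]], [[1]], [[0]], [[1]], [[1]]])
print(one_step_samples.shape)  # (8, 1, 1): 8 samples, 1 timestep, 1 feature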

Keras uses LSTM or GRU for its hidden layers.

I've tried two methods for this problem, but neither seems to succeed. Both methods get stuck at 37.5% accuracy. In fact, the model keeps guessing 1.

Method 1:

import numpy as np
from keras.models import Sequential
from keras.layers import GRU, Dense

data = [[[0], [0], [0], [1], [1], [0], [1], [1]]]    # shape (1, 8, 1): 1 sample, 8 timesteps, 1 feature
output = [[[0], [0], [0], [1], [0], [1], [1], [0]]]  # shape (1, 8, 1)

model = Sequential()
model.add(GRU(10, input_shape=(8, 1), return_sequences=True))
model.add(GRU(10, return_sequences=True))
model.add(Dense(1, activation='softmax'))

model.compile(loss='binary_crossentropy', optimizer='adam', metrics=['acc'])
model.fit(np.asarray(data), np.asarray(output), epochs=3000)

Method 2:

import numpy as np
from keras.models import Sequential
from keras.layers import GRU, Dense

data = [[[0]], [[0]], [[0]], [[1]], [[1]], [[0]], [[1]], [[1]]]  # shape (8, 1, 1): 8 samples, 1 timestep, 1 feature
output = [[0], [0], [0], [1], [0], [1], [1], [0]]                # shape (8, 1)

model = Sequential()
model.add(GRU(10, input_shape=(1, 1), return_sequences=True))
model.add(GRU(10))
model.add(Dense(1, activation='softmax'))

model.compile(loss='binary_crossentropy', optimizer='adam', metrics=['acc'])
model.fit(np.asarray(data), np.asarray(output), epochs=300)

In fact, it keeps guessing 1.

That's because you have used softmax as the activation of the last layer. Since the last layer has only one unit and the softmax function normalizes its input so that the elements sum to one, it will always output 1. Instead, you need to use sigmoid as the activation function of the last layer to get an output between zero and one.
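You can check this with a quick NumPy sketch: softmax over a single-element vector is always exactly 1, regardless of the input value:

import numpy as np

def softmax(x):
    # Exponentiate (shifted for numerical stability) and normalize to sum to 1.
    e = np.exp(x - np.max(x))
    return e / e.sum()

print(softmax(np.array([-3.2])))  # [1.]
print(softmax(np.array([10.0])))  # [1.]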

Change activation='softmax' to activation='sigmoid'.
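For reference, a minimal corrected version of Method 1 with this one-line change (data, imports, and everything else as in the question):

model = Sequential()
model.add(GRU(10, input_shape=(8, 1), return_sequences=True))
model.add(GRU(10, return_sequences=True))
model.add(Dense(1, activation='sigmoid'))  # sigmoid maps each unit to (0, 1) independently

model.compile(loss='binary_crossentropy', optimizer='adam', metrics=['acc'])
model.fit(np.asarray(data), np.asarray(output), epochs=3000)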
