
Keras LSTM poor prediction

I am trying to predict the next value based on a sequence (I have 5 values like 1, 2, 3, 4, 5 and want to predict the next one, 6). I am using a Keras LSTM for that.

creating training data:

import numpy as np 
from keras.models import Sequential
from keras.layers import LSTM,Dense
a = [float(i) for i in range(1,100)]
a = np.array(a)

data_train = a[:int(len(a)*0.9)]
data_test = a[int(len(a)*0.9):]

x = 5
y = 1
z = 0

train_x = []
train_y = []
for i in data_train:
    t = data_train[z:x]
    r = data_train[x:x+y]
    if len(r) == 0:
        break
    else:
        train_x.append(t)
        train_y.append(r)
        z = z + 1
        x = x+1

train_x = np.array(train_x)
train_y = np.array(train_y)

x = 5
y = 1
z = 0

test_x = []
test_y = []
for i in data_test:
    t = data_test[z:x]
    r = data_test[x:x+y]
    if len(r) == 0:
        break
    else:
        test_x.append(t)
        test_y.append(r)
        z = z + 1
        x = x+1

test_x = np.array(test_x)
test_y = np.array(test_y)

print(train_x.shape,train_y.shape)
print(test_x.shape,test_y.shape)
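
For reference, with the 99-element series above these prints give (84, 5) (84, 1) and (5, 5) (5, 1). The same windows can also be built more compactly; this is just an equivalent sketch, not part of the original post:

# Equivalent windowing with list comprehensions (window = 5, horizon = 1)
window, horizon = 5, 1
n = len(data_train) - window - horizon + 1
train_x = np.array([data_train[i:i + window] for i in range(n)])
train_y = np.array([data_train[i + window:i + window + horizon] for i in range(n)])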

transform it into an LSTM-friendly shape:

train_x_1 = train_x.reshape(train_x.shape[0],len(train_x[0]),1)
train_y_1 = train_y.reshape(train_y.shape[0],1)
test_x_1 = test_x.reshape(test_x.shape[0],len(test_x[0]),1)
test_y_1 = test_y.reshape(test_y.shape[0],1)


print(train_x_1.shape, train_y_1.shape)
print(test_x_1.shape, test_y_1.shape)

build and train model:

model = Sequential()
model.add(LSTM(32, return_sequences=False, input_shape=(train_x_1.shape[1], 1)))
model.add(Dense(1))

model.compile(loss='mse',  optimizer='adam', metrics=['accuracy'])
history = model.fit(train_x_1,
                    train_y_1,
                    epochs=20,
                    shuffle=False, 
                    batch_size=1, 
                    verbose=2, 
                    validation_data=(test_x_1,test_y_1))

But I get a really bad result. Can somebody explain to me what I am doing wrong?

pred = model.predict(test_x_1)
for i,a in enumerate(pred):
    print(pred[i],test_y_1[i])
[89.71895] [95.]
[89.87877] [96.]
[90.03465] [97.]
[90.18714] [98.]
[90.337006] [99.]

Thanks.

You expect the network to extrapolate beyond the data you used for training. Neural networks are not good at this. You could try to normalize your data so that you are no longer extrapolating, for example by using relative values instead of absolute values. That would of course make this example very trivial.
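
A minimal sketch of that "relative values" idea, assuming the same window length of 5 as in the question (the variable names are illustrative, not from the original post): train on the differences between consecutive values, then add the predicted delta back to the last known value.

import numpy as np

a = np.arange(1, 100, dtype=float)

# First differences: for this series every delta is 1.0, so the network
# never has to produce a value outside the range it was trained on.
diffs = np.diff(a)

window = 5
X = np.array([diffs[i:i + window] for i in range(len(diffs) - window)])
y = diffs[window:]

X = X.reshape(X.shape[0], window, 1)   # (samples, timesteps, features)
y = y.reshape(-1, 1)

# After training on the deltas, an absolute prediction is recovered with
# next_value = last_value + model.predict(last_window)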

Use the first derivative (difference) of x, which here is always delta x == 1, and then it works. But of course this makes no real sense for such a simple formula. Try something more complicated, like my example:

LSTM prediction of a damped sine curve

Source:

https://www.kaggle.com/maciejbednarz/lstm-example
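
The notebook itself is linked above; as a rough, self-contained sketch of that kind of task (the damping factor, window length, and network size here are arbitrary choices, not taken from the notebook):

import numpy as np
from keras.models import Sequential
from keras.layers import LSTM, Dense

# Damped sine: y(t) = exp(-0.05 * t) * sin(t). The decaying amplitude makes
# this a genuinely harder sequence than a straight line.
t = np.linspace(0, 50, 1000)
series = np.exp(-0.05 * t) * np.sin(t)

window = 20
X = np.array([series[i:i + window] for i in range(len(series) - window)])
y = series[window:]

X = X.reshape(-1, window, 1)
y = y.reshape(-1, 1)

split = int(0.9 * len(X))

model = Sequential()
model.add(LSTM(32, input_shape=(window, 1)))
model.add(Dense(1))
model.compile(loss='mse', optimizer='adam')
model.fit(X[:split], y[:split],
          epochs=10, batch_size=32, verbose=2,
          validation_data=(X[split:], y[split:]))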
