
LSTM Neural Network for temperature time series predictions

I'm learning to work with neural networks applied to time series, so I tuned an LSTM example that I found in order to make predictions of daily temperature data. However, the results are extremely poor, as shown in the image. (For now I only predict the last 92 days in order to save time.)

[Image: LSTM predictions of daily minimum temperatures]

This is the code I implemented. The data are a 3-column dataframe (minimum, maximum and mean daily temperatures), but I only use one of the columns at a time.

import numpy as np
import pandas as pd
import matplotlib.pyplot as plt
from statsmodels.tools.eval_measures import rmse
from sklearn.preprocessing import MinMaxScaler
from keras.preprocessing.sequence import TimeseriesGenerator
from keras.models import Sequential
from keras.layers import Dense
from keras.layers import LSTM
from keras.layers import Dropout
import warnings
warnings.filterwarnings("ignore")

input_file2 = "TemperaturasCampillos.txt"
seriesT = pd.read_csv(input_file2, sep="\t", decimal=".", names=["Minimas", "Maximas", "Medias"])
seriesT[seriesT == -999] = np.nan  # -999 is the missing-value sentinel

date1 = '2010-01-01'
date2 = '2010-09-01'
date3 = '2020-05-17'
date4 = '2020-12-31'
mydates = pd.date_range(date2, date3).tolist()
seriesT['Fecha'] = mydates
seriesT.set_index('Fecha',inplace=True)  # use the dates as the index so they appear on the x axis by default
seriesT.index = seriesT.index.to_pydatetime()

df = seriesT.drop(seriesT.columns[[1, 2]], axis=1)  # keep only the first column (Minimas); df.columns is a zero-based pd.Index
n_input = 92
train, test = df[:-n_input], df[-n_input:]

scaler = MinMaxScaler()
scaler.fit(train)
train = scaler.transform(train)
test = scaler.transform(test)


#n_input = 365
n_features = 1
generator = TimeseriesGenerator(train, train, length=n_input, batch_size=1)
model = Sequential()
model.add(LSTM(200, activation='relu', input_shape=(n_input, n_features)))
model.add(Dropout(0.15))
model.add(Dense(1))
model.compile(optimizer='adam', loss='mse')
model.fit_generator(generator,epochs=150)

#create an empty list for our predictions
#create the batch that our model will predict off of
#save each prediction to our list
#add the prediction to the end of the batch to be used in the next prediction

pred_list = []

batch = train[-n_input:].reshape((1, n_input, n_features))

for i in range(n_input):   
    pred_list.append(model.predict(batch)[0]) 
    batch = np.append(batch[:,1:,:],[[pred_list[i]]],axis=1)

df_predict = pd.DataFrame(scaler.inverse_transform(pred_list),                           
                          index=df[-n_input:].index, columns=['Prediction'])
df_test = pd.concat([df,df_predict], axis=1)

plt.figure(figsize=(20, 5))
plt.plot(df_test.index, df_test['Minimas'])
plt.plot(df_test.index, df_test['Prediction'], color='r')
plt.legend(loc='best', fontsize='xx-large')
plt.xticks(fontsize=18)
plt.yticks(fontsize=16)
plt.show()
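Since rmse is imported above but not used, here is a quick way to quantify how far off the 92-day forecast is (a sketch, assuming df_test has been built as above; days with a missing observed minimum are dropped):

# Sketch: measure the error on the predicted window with the rmse imported above
window = df_test[['Minimas', 'Prediction']].iloc[-n_input:].dropna()
print("RMSE over the predicted window:", rmse(window['Minimas'], window['Prediction']))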

As you can see if you click on the image link, the prediction I get is too smoothed: it captures the seasonality nicely, but it is not what I am looking for. In addition, I tried to add more layers to the neural network, so the network looks something like this:

#n_input = 365
n_features = 1
generator = TimeseriesGenerator(train, train, length=n_input, batch_size=1)
model = Sequential()
model.add(LSTM(200, activation='relu', input_shape=(n_input, n_features)))
model.add(LSTM(128, activation='relu'))
model.add(LSTM(256, activation='relu'))
model.add(LSTM(128, activation='relu'))
model.add(LSTM(64, activation='relu'))
model.add(LSTM(n_features, activation='relu'))
model.add(Dropout(0.15))
model.add(Dense(1))
model.compile(optimizer='adam', loss='mse')
model.fit_generator(generator,epochs=100)

but I get this error:

ValueError: Input 0 is incompatible with layer lstm_86: expected ndim=3, found ndim=2

Of course, since the model performs badly, I cannot expect out-of-sample predictions to be accurate. Why can't I add more layers to the network? How could I improve the performance?

You are missing one argument: return_sequences.

When you stack more than one LSTM layer, you should set it to True; otherwise a layer only outputs its last hidden state, while the next LSTM layer expects the full sequence. Add it to every LSTM layer that is followed by another LSTM layer (the last one can keep the default so it feeds a single vector into the Dense layer).

model.add(LSTM(128, activation='relu', return_sequences=True))
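
For example, the stacked network from the question could be wired like this (a sketch reusing the same n_input, n_features and generator as above; only the layers that feed another LSTM return sequences, and the last LSTM keeps the default so its single hidden state goes into the Dense output):

model = Sequential()
model.add(LSTM(200, activation='relu', return_sequences=True,
               input_shape=(n_input, n_features)))  # returns the full sequence for the next LSTM
model.add(LSTM(128, activation='relu', return_sequences=True))
model.add(LSTM(64, activation='relu'))  # last LSTM: returns only the final hidden state
model.add(Dropout(0.15))
model.add(Dense(1))
model.compile(optimizer='adam', loss='mse')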

About the poor performance: my guess is that you have too little data for this application (and the data look pretty noisy), so adding layers won't help much.
