
How to feed output of keras LSTM layer into input layer?

I am fairly new to TensorFlow and Keras and have a question. I want to do time series prediction using an LSTM layer, with some modifications. I started with the example given in the TensorFlow tutorial:

import tensorflow as tf

def build_LSTM(neurons, batch_size, history_size, features):
    model = tf.keras.models.Sequential()
    # A stateful LSTM needs a fixed batch size, given via batch_input_shape
    model.add(tf.keras.layers.LSTM(neurons,
                                   batch_input_shape=(batch_size, history_size, features),
                                   stateful=True))
    # Single output: the prediction for the next time step
    model.add(tf.keras.layers.Dense(1))
    model.compile(loss='mean_squared_error', optimizer='adam')
    return model

In its current state from the example, the input for the model is of the form (observations, time steps, features), and it returns a single number (the prediction for the next time step).
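For illustration, a minimal sketch of how the shapes line up (the sizes 32 units, batch size 1, 20 time steps, and 3 features are arbitrary assumptions):

    import numpy as np
    import tensorflow as tf

    model = build_LSTM(neurons=32, batch_size=1, history_size=20, features=3)

    # One observation of shape (batch_size, history_size, features) ...
    x = np.random.rand(1, 20, 3).astype("float32")

    # ... produces a single prediction of shape (batch_size, 1)
    y = model.predict(x)
    print(y.shape)  # (1, 1)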

What I want to do is use return_sequences=True in the LSTM layer.

Is it correct that this returns a tensor T of shape (time steps, features)?

Is there a way to transfer this tensor from one step (let's say observation = 1) to the next step (observation = 2)? I guess the corresponding graph would look like this:

(image: sketch of the intended graph)

To answer your question, "Is it correct that this returns a tensor T of shape (time steps, features)?"

The answer is yes: the output is a tensor containing one output vector for each time step, so its shape is (batch size, time steps, units), where units is the number of LSTM units rather than the number of input features.
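A small sketch of this, with assumed sizes (4 units, batch of 1, 20 time steps, 3 features):

    import numpy as np
    import tensorflow as tf

    seq_model = tf.keras.models.Sequential([
        tf.keras.layers.LSTM(4,
                             batch_input_shape=(1, 20, 3),
                             stateful=True,
                             return_sequences=True),
    ])

    x = np.random.rand(1, 20, 3).astype("float32")
    out = seq_model.predict(x)
    print(out.shape)  # (1, 20, 4): one output per time step, sized by the number of units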

Another question: "Is there a way to transfer this tensor from one step (let's say observation = 1) to the next step (observation = 2)?"

This question is harder to answer. Technically, when you specify return_sequences=True, the layer still computes each time step in order and feeds the "current state" back to itself as the initial state for the next time step, until it has processed all of your data and produced the tensor output asked about in question 1. So if you want this tensor for further computing, for example summing up the outputs of all odd time steps, that is possible. Moreover, if you want to pass your last state on to the next batch of input, you can achieve that with the stateful=True argument.
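For instance, a sketch of the "sum over odd time steps" idea using the functional API (all sizes here are assumptions, not part of the original model):

    import tensorflow as tf

    # Fixed batch size of 1 so that stateful=True can carry the last state to the next batch
    inputs = tf.keras.Input(shape=(20, 3), batch_size=1)
    seq = tf.keras.layers.LSTM(8, stateful=True, return_sequences=True)(inputs)      # (1, 20, 8)
    # Further computing on the per-time-step tensor: sum the outputs at odd time steps
    odd_sum = tf.keras.layers.Lambda(lambda t: tf.reduce_sum(t[:, 1::2, :], axis=1))(seq)  # (1, 8)
    prediction = tf.keras.layers.Dense(1)(odd_sum)

    model = tf.keras.Model(inputs, prediction)
    model.compile(loss="mean_squared_error", optimizer="adam")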

However, if you want to feed the output of the last time step into the current time step (something like closed-loop control), regardless of the given model, you need to create your own recurrent cell and use it with the RNN layer: custom_model = RNN(custom_recurrent_cell, return_sequences=True).
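A minimal sketch of such a custom cell (the name FeedbackCell and its internals are purely illustrative, not a standard Keras class):

    import tensorflow as tf

    class FeedbackCell(tf.keras.layers.Layer):
        """Illustrative cell: the previous output is carried as the recurrent state
        and concatenated with the current input, so the last output is fed back in."""

        def __init__(self, units, **kwargs):
            super().__init__(**kwargs)
            self.units = units
            self.state_size = units   # previous output carried as the recurrent state
            self.output_size = units
            self.dense = tf.keras.layers.Dense(units, activation="tanh")

        def call(self, inputs, states):
            prev_output = states[0]
            combined = tf.concat([inputs, prev_output], axis=-1)
            output = self.dense(combined)
            return output, [output]   # the output is also the next state

    # Wrap the cell in an RNN layer; it returns one output per time step
    custom_model = tf.keras.layers.RNN(FeedbackCell(16), return_sequences=True)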

