
Keras LSTM Input 0 of layer sequential_10 is incompatible with the layer

My code for the LSTM is as follows:

def myLSTM(i_shape, o_shape):
    input = keras.layers.Input(i_shape)
    model = Sequential()
    x = keras.layers.LSTM(128, return_sequences = True, input_shape = (x_train.shape[1], 1))(input)
    x = keras.layers.Dropout(0.2)(x)
    x = keras.layers.LSTM(128, return_sequences = True)(x)
    x = keras.layers.Dropout(0.2)(x)
    x = keras.layers.LSTM(64, return_sequences = True)(x)
    x = keras.layers.Dropout(0.2)(x)
    output = layers.Dense(units = 1, activation='softmax')(x)
    return Model(input, output)

my_lstm = myLSTM(x_train.shape[1:], y_train.shape[1:])
my_lstm.compile(loss='categorical_crossentropy', optimizer='adam', metrics=['acc'])
my_lstm.summary()

I am getting the following error:

ValueError: Input 0 of layer lstm is incompatible with the layer: expected ndim=3, found ndim=2. Full shape received: (None, 20)

This error confuses me because I feel like a 3-dimensional shape is passed into the LSTM but it shows that a 2-dimensional shape is detected.

The dimensions of my data are as follows: x_train shape is (207, 20), y_train shape is (207, 5), x_test shape is (24, 20), and y_test shape is (24, 5).

I'm also running this LSTM for a classification use case, as you can see in my code.

As @Andrey mentioned, an LSTM expects its input data to have a 3D shape: [batch_size, time_steps, feature_size].

For example, if we supply an 8-dimensional feature vector for each of 10 time steps across each of 32 batch samples, the input data should look something like:

X_train = tf.random.normal([32, 10, 8])
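In your case, x_train has shape (207, 20), which is 2D, hence the ndim=2 error. A minimal sketch of the fix (using randomly generated stand-in data with the same shapes as yours, since the original dataset isn't shown): add a trailing feature axis to get (207, 20, 1), drop `return_sequences=True` from the last LSTM so the Dense layer sees a single vector per sample, and give the softmax Dense layer 5 units to match your 5-class one-hot labels.

```python
import numpy as np
from tensorflow import keras

# Stand-in data with the same shapes as the question's:
# 207 samples, 20 time steps, 5 one-hot classes.
x_train = np.random.rand(207, 20).astype("float32")
y_train = np.eye(5, dtype="float32")[np.random.randint(0, 5, 207)]

# Add a trailing feature axis: (207, 20) -> (207, 20, 1),
# i.e. 20 time steps of a 1-dimensional feature each.
x_train = x_train.reshape((x_train.shape[0], x_train.shape[1], 1))

def my_lstm(i_shape, n_classes):
    inputs = keras.layers.Input(shape=i_shape)          # i_shape = (20, 1)
    x = keras.layers.LSTM(128, return_sequences=True)(inputs)
    x = keras.layers.Dropout(0.2)(x)
    x = keras.layers.LSTM(128, return_sequences=True)(x)
    x = keras.layers.Dropout(0.2)(x)
    x = keras.layers.LSTM(64)(x)                        # last LSTM: only the final step
    x = keras.layers.Dropout(0.2)(x)
    outputs = keras.layers.Dense(n_classes, activation="softmax")(x)
    return keras.Model(inputs, outputs)

model = my_lstm(x_train.shape[1:], y_train.shape[1])
model.compile(loss="categorical_crossentropy", optimizer="adam", metrics=["acc"])
print(model.output_shape)  # (None, 5) — one probability per class
```

With `units=1` and softmax, the original model would always output 1.0, so the class count must match the label width even once the shape error is fixed.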

