
Change CNN to LSTM (Keras / TensorFlow)

I have a CNN and would like to change it to an LSTM, but whenever I modify my code I get the same error: ValueError: Input 0 is incompatible with layer gru_1: expected ndim=3, found ndim=4

I already tried changing ndim, but it didn't work.

Here is my CNN:

from keras.models import Sequential
from keras.layers import Convolution2D, MaxPooling2D, Flatten, Dense, Dropout, Activation
from keras.layers.normalization import BatchNormalization
from keras.layers.advanced_activations import ELU


def build_model(X, Y, nb_classes):
    nb_filters = 32  # number of convolutional filters to use
    pool_size = (2, 2)  # size of pooling area for max pooling
    kernel_size = (3, 3)  # convolution kernel size
    nb_layers = 4
    input_shape = (1, X.shape[2], X.shape[3])

    model = Sequential()
    model.add(Convolution2D(nb_filters, kernel_size[0], kernel_size[1],
                        border_mode='valid', input_shape=input_shape))

    model.add(BatchNormalization(axis=1))
    model.add(Activation('relu'))

    for layer in range(nb_layers-1):
        model.add(Convolution2D(nb_filters, kernel_size[0], kernel_size[1]))
        model.add(BatchNormalization(axis=1))
        model.add(ELU(alpha=1.0))  
        model.add(MaxPooling2D(pool_size=pool_size))
        model.add(Dropout(0.25))

    model.add(Flatten())
    model.add(Dense(128))
    model.add(Activation('relu'))
    model.add(Dropout(0.5))
    model.add(Dense(nb_classes))
    model.add(Activation("softmax"))
    return model

And here is how I tried to build my LSTM:

from keras.models import Sequential
from keras.layers import LSTM, Dense, Dropout

data_dim = 41
timesteps = 20
num_classes = 10

model = Sequential()

model.add(LSTM(256, return_sequences=True, input_shape=(timesteps, data_dim)))  
model.add(Dropout(0.5))

model.add(LSTM(128, return_sequences=True, input_shape=(timesteps, data_dim)))  
model.add(Dropout(0.25))

model.add(LSTM(64))  
model.add(Dropout(0.2))

model.add(Dense(num_classes, activation='softmax'))

What am I doing wrong? Thanks.

The LSTM code is fine; it executes with no errors for me. The error you are seeing comes from an internal incompatibility between tensors inside the model itself, not from the training data; a training-data mismatch would instead give you something like "Exception: Invalid input shape".

What's confusing about your error is that it refers to a GRU layer, which doesn't appear anywhere in your model definition. If your model only contained LSTM layers, you would get an error that calls out the LSTM layer it conflicts with.
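For context, a recurrent layer in Keras expects 3-D input of shape (batch, timesteps, features), whereas the Conv2D stack in the question consumes 4-D input. Below is a minimal sketch (assuming the standalone keras package; it reuses the question's timesteps and data_dim) showing the working 3-D contract, with a comment reproducing the ndim error when a CNN-style 4-D input_shape is handed to a recurrent layer:

import numpy as np
from keras.models import Sequential
from keras.layers import LSTM, Dense

timesteps, data_dim, num_classes = 20, 41, 10

# Recurrent layers want (batch, timesteps, features) -> ndim=3.
model = Sequential()
model.add(LSTM(64, input_shape=(timesteps, data_dim)))
model.add(Dense(num_classes, activation='softmax'))
model.compile(loss='categorical_crossentropy', optimizer='adam')

x = np.random.random((8, timesteps, data_dim))                 # (batch, 20, 41)
y = np.eye(num_classes)[np.random.randint(0, num_classes, 8)]  # one-hot labels
model.train_on_batch(x, y)                                     # runs fine

# Reusing the CNN-style 4-D input_shape (e.g. (1, height, width)) on a
# recurrent layer already fails at model-definition time:
#   Sequential([LSTM(64, input_shape=(1, 20, 41))])
#   -> ValueError: ... expected ndim=3, found ndim=4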

Perhaps check

model.get_config()

and make sure all the layers and configs are what you intended. In particular, the first layer should say this:

'batch_input_shape': (None, 20, 41)
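For example, a quick sketch of that check (the exact get_config() layout varies between Keras versions, so both the list and dict forms are handled here):

# Inspect the first layer's declared input shape.
cfg = model.get_config()
layers = cfg if isinstance(cfg, list) else cfg['layers']
print(layers[0]['config'].get('batch_input_shape'))   # expect (None, 20, 41)

model.summary()   # per-layer output shapes are often the fastest sanity check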
