
Keras Lambda layer throws ndim error in functional API, but not in Sequential

I'm trying to get, separately, the output of an LSTM layer at every time step and the output at the last time step only (the per-step outputs and the context vector). The solution I found is to add a Lambda layer that extracts the context vector from an LSTM with return_sequences=True. In a Sequential model this worked fine, but when I implement the same thing in the functional API it suddenly stops accepting the dimensions, claiming everything has ndim=1 even though it does not. Code:

from keras.layers import Input, LSTM, Lambda, RepeatVector, TimeDistributed, Dense, concatenate
from keras.models import Model

def ContextVector(x):
    # intended to grab the last time step of the LSTM output
    return x[-1][-1]

def ContextVectorOut(input_shape):
    # output_shape function for the Lambda (full shape, including the batch dim)
    print([None, input_shape[-1]])
    print((input_shape[::2]))
    print(input_shape)
    return list((None, input_shape[-1]))

input_layer = Input(shape=(10, 5))
LSTM_layer = LSTM(5, return_sequences=True)(input_layer)
context_layer = Lambda(ContextVector, output_shape=ContextVectorOut)(LSTM_layer)
repeat_context_layer = RepeatVector(10, name='context')(context_layer)
timed_dense = TimeDistributed(Dense(10))(LSTM_layer)
connected_dense = Dense(2)  # shared Dense layer, applied to both branches below
connect_dense_context = connected_dense(repeat_context_layer)
connect_dense_time = connected_dense(timed_dense)
concat_out = concatenate([connect_dense_context, connect_dense_time])
output_dense = Dense(5)(concat_out)
model = Model(inputs=[input_layer], outputs=output_dense)

# Sequential version where the same Lambda worked:
#model.add(LSTM(20, input_shape=(10, 5), return_sequences=True))
#model.add(Lambda(ContextVector, output_shape=ContextVectorOut))
#model.add(Dense(1))

model.summary()

Error:

---------------------------------------------------------------------------
ValueError                                Traceback (most recent call last)
<ipython-input-74-016b4a976d40> in <module>()
     10 LSTM_layer = LSTM(5, return_sequences=True)(input_layer)
     11 context_layer = Lambda(ContextVector, output_shape=ContextVectorOut)(LSTM_layer)
---> 12 repeat_context_layer = RepeatVector(10, name='context')(context_layer)
     13 timed_dense = TimeDistributed(Dense(10))(LSTM_layer)
     14 connected_dense = Dense(2)

C:\ProgramData\Miniconda3\lib\site-packages\keras\engine\base_layer.py in __call__(self, inputs, **kwargs)
    412                 # Raise exceptions in case the input is not compatible
    413                 # with the input_spec specified in the layer constructor.
--> 414                 self.assert_input_compatibility(inputs)
    415 
    416                 # Collect input shapes to build layer.

C:\ProgramData\Miniconda3\lib\site-packages\keras\engine\base_layer.py in assert_input_compatibility(self, inputs)
    309                                      self.name + ': expected ndim=' +
    310                                      str(spec.ndim) + ', found ndim=' +
--> 311                                      str(K.ndim(x)))
    312             if spec.max_ndim is not None:
    313                 ndim = K.ndim(x)

ValueError: Input 0 is incompatible with layer context: expected ndim=2, found ndim=1

I found my mistake: I was returning x[-1][-1] where I should have returned x[-1] only. The ndim error comes from the output of the Lambda layer, not from the previous layer.
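
For completeness, a minimal sketch of the corrected Lambda. Note that x[-1] indexes the first (batch) axis of the tensor; if the intent is the last time step of every sample in the batch, the batch-preserving slice x[:, -1] is the usual form. The snippet below is a hedged variant based on that assumption and reuses the names from the question:

def ContextVector(x):
    # (batch, timesteps, units) -> (batch, units): last time step of every sample
    return x[:, -1]

def ContextVectorOut(input_shape):
    # full output shape, including the batch dim: drop the time axis
    return (input_shape[0], input_shape[-1])

context_layer = Lambda(ContextVector, output_shape=ContextVectorOut)(LSTM_layer)
# RepeatVector now sees an ndim=2 input, as its InputSpec requires
repeat_context_layer = RepeatVector(10, name='context')(context_layer)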
