Output tensors to a Model must be the output of a Keras `Layer` (thus holding past layer metadata) when using the Functional API for a CNN-LSTM
I am trying to do a simple CNN-LSTM classification with TimeDistributed, but I am getting the following error: Output tensors to a Model must be the output of a Keras Layer (thus holding past layer metadata). Found:
My samples are grayscale images with 366 channels and a 5x5 size; each sample has its own unique label.
model_input = Input(shape=(366,5,5))
model = TimeDistributed(Conv2D(64, (3, 3), activation='relu', padding='same',data_format='channels_first')(model_input))
model = TimeDistributed(MaxPooling2D((2, 2),padding='same',data_format='channels_first'))
model = TimeDistributed(Conv2D(128, (3,3), activation='relu',padding='same',data_format='channels_first'))
model = TimeDistributed(MaxPooling2D((2, 2), strides=(2, 2),padding='same',data_format='channels_first'))
model = Flatten()
model = LSTM(256, return_sequences=False, dropout=0.5)
model = Dense(128, activation='relu')
model = Dense(6, activation='softmax')
cnnlstm = Model(model_input, model)
cnnlstm.compile(optimizer='adamax',
loss='sparse_categorical_crossentropy',
metrics=['accuracy'])
cnnlstm.summary()
You have to pass tensors between the layers, as this is how the Functional API works: every layer is applied to its input using the Layer(params...)(input) notation:
model_input = Input(shape=(366,5,5))
model = TimeDistributed(Conv2D(64, (3, 3), activation='relu', padding='same',data_format='channels_first'))(model_input)
model = TimeDistributed(MaxPooling2D((2, 2),padding='same',data_format='channels_first'))(model)
model = TimeDistributed(Conv2D(128, (3,3), activation='relu',padding='same',data_format='channels_first'))(model)
model = TimeDistributed(MaxPooling2D((2, 2), strides=(2, 2),padding='same',data_format='channels_first'))(model)
model = TimeDistributed(Flatten())(model)
model = LSTM(256, return_sequences=False, dropout=0.5)(model)
model = Dense(128, activation='relu')(model)
model = Dense(6, activation='softmax')(model)
cnnlstm = Model(model_input, model)
Note that I have also corrected the first TimeDistributed layer, as the tensor was passed in the wrong place (it was applied to the inner Conv2D instead of to the TimeDistributed wrapper).
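To see why the original code fails, here is an illustrative sketch in plain Python (no Keras required, and the `Tensor`/`Layer` classes are simplified stand-ins, not the real Keras classes): in the Functional API every layer is a callable that maps a tensor to a new tensor carrying layer metadata. Assigning the layer object itself, as the broken code does, hands Model a Layer rather than a tensor, which is exactly what the error message complains about.

```python
class Tensor:
    """Stand-in for a Keras tensor: records which layers produced it."""
    def __init__(self, history):
        self.history = history

class Layer:
    """Stand-in for a Keras layer: calling it on a tensor yields a new tensor."""
    def __init__(self, name):
        self.name = name
    def __call__(self, tensor):
        # Calling the layer on a tensor returns a *new* tensor with metadata.
        return Tensor(tensor.history + [self.name])

inp = Tensor(["Input"])

# Correct: chain the calls, so each intermediate result is a tensor.
x = Layer("Conv2D")(inp)
x = Layer("Flatten")(x)
print(x.history)              # ['Input', 'Conv2D', 'Flatten']

# Broken (like the question's code): the layer is never called on a tensor,
# so `y` is a Layer object, not a Tensor -- this is what Model rejects.
y = Layer("Dense")
print(isinstance(y, Tensor))  # False
```

The same reasoning explains the fix: each line in the corrected answer ends with `(model)` or `(model_input)`, so `model` always holds a tensor, and `Model(model_input, model)` receives a valid output tensor.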