
Merge 6 inputs in Conv1D keras

I have written a Conv1D model structure in Keras, and I want to merge six different inputs of the same shape. Previously, Merge([model1, model2, model3, model4, model5, model6], mode='concat') worked just fine, but after the recent updates I can't use Merge anymore.

Concatenate can be used as follows:

from keras.layers import Concatenate
model = Concatenate([model1, model2, model3, model4, model5, model6])

But I want to add Dense layers before the softmax layer to this merged model, and I can't add them to Concatenate because it accepts only tensor inputs.

How do I merge the six inputs before passing them to the two Dense layers and the softmax layer?

My current code is as follows:

import keras
from keras.models import Sequential
from keras.layers import Conv1D, MaxPooling1D, Dropout, Flatten, Dense

input_shape = (64, 250)

model1 = Sequential()
model1.add(Conv1D(64, 2, activation='relu', input_shape=input_shape))
model1.add(Conv1D(64, 2, activation='relu'))
model1.add(MaxPooling1D(2))
model1.add(Dropout(0.75))
model1.add(Flatten())

model2 = Sequential()
model2.add(Conv1D(128, 2, activation='relu', input_shape=input_shape))
model2.add(Conv1D(128, 2, activation='relu'))
model2.add(MaxPooling1D(2))
model2.add(Dropout(0.75))
model2.add(Flatten())

model3 = Sequential()
model3.add(Conv1D(128, 2, activation='relu', input_shape=input_shape))
model3.add(Conv1D(128, 2, activation='relu'))
model3.add(MaxPooling1D(2))
model3.add(Dropout(0.75))
model3.add(Flatten())

model4 = Sequential()
model4.add(Conv1D(128, 2, activation='relu', input_shape=input_shape))
model4.add(Conv1D(128, 2, activation='relu'))
model4.add(MaxPooling1D(2))
model4.add(Dropout(0.75))
model4.add(Flatten())

model5 = Sequential()
model5.add(Conv1D(128, 2, activation='relu', input_shape=input_shape))
model5.add(Conv1D(128, 2, activation='relu'))
model5.add(MaxPooling1D(2))
model5.add(Dropout(0.75))
model5.add(Flatten())

model6 = Sequential()
model6.add(Conv1D(128, 2, activation='relu', input_shape=input_shape))
model6.add(Conv1D(128, 2, activation='relu'))
model6.add(MaxPooling1D(2))
model6.add(Dropout(0.75))
model6.add(Flatten())

from keras.layers import Concatenate
model = Concatenate([ model1, model2, model3, model4, model5, model6])
model.add(Dense(512, activation='relu'))
model.add(Dropout(0.75))
model.add(Dense(1024, activation='relu'))
model.add(Dropout(0.75))

model.add(Dense(40, activation='softmax'))
opt = keras.optimizers.adam(lr=0.001, decay=1e-6)
model.compile(loss='categorical_crossentropy', optimizer=opt, metrics=['accuracy'])
model.fit([d1, d2, d3, d4, d5, d6], label, validation_split=0.2, batch_size=25, epochs=30)

The way you are calling Concatenate is not correct. Concatenate's constructor expects only an argument specifying the axis of concatenation; the resulting layer is then called on a list of tensors. What you are trying to achieve can be done with Keras's functional API. Just change the following code

model = Concatenate([ model1, model2, model3, model4, model5, model6])
model.add(Dense(512, activation='relu'))
model.add(Dropout(0.75))
model.add(Dense(1024, activation='relu'))
model.add(Dropout(0.75))

model.add(Dense(40, activation='softmax'))

to

from keras.models import Model

merged = Concatenate()([model1.output, model2.output, model3.output,
                        model4.output, model5.output, model6.output])

merged = Dense(512, activation='relu')(merged)
merged = Dropout(0.75)(merged)
merged = Dense(1024, activation='relu')(merged)
merged = Dropout(0.75)(merged)

merged = Dense(40, activation='softmax')(merged)

model = Model(inputs=[model1.input, model2.input, model3.input, model4.input, model5.input, model6.input], outputs=merged)
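
With the functional Model built this way, the compile and fit calls from the question can be reused unchanged; a minimal sketch, assuming d1 to d6 and label are the same training arrays as in the question:

opt = keras.optimizers.adam(lr=0.001, decay=1e-6)
model.compile(loss='categorical_crossentropy', optimizer=opt, metrics=['accuracy'])
# each of the six inputs gets its own array, in the same order as the inputs list above
model.fit([d1, d2, d3, d4, d5, d6], label, validation_split=0.2, batch_size=25, epochs=30)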

NB

Though it is not the question being asked, I've noticed that you are using a very large dropout rate (but this may depend on the problem you are trying to solve). A 0.75 drop rate means you are dropping 75% of the neurons during training. Please consider using a smaller drop rate, because otherwise the model might not converge.
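
For illustration only (the exact rate is an assumption and depends on your data), a more conventional starting point for the dense head would look like:

merged = Dense(512, activation='relu')(merged)
merged = Dropout(0.25)(merged)  # drops 25% of units during training instead of 75%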
