
How to split the input across parallel layers in Keras

I am trying to build a model for classifying 12-lead ECG signals. I want each lead to first go through its own Conv1D layer, and then to concatenate all of the outputs. I don't know how to split the input while feeding it to the model. Here is what I have tried, but I got an error:


import tensorflow as tf
from tensorflow import keras
from tensorflow.keras import layers

input = keras.Input(shape=(1000, 12))

conv1=(layers.Conv1D(32,(7),activation='relu'))
conv1=conv1(input[:,:,0])

conv2=(layers.Conv1D(32,(7),activation='relu'))
conv2=conv2(input[:,:,1])

conv3=(layers.Conv1D(32,(7),activation='relu'))
conv3=conv3(input[:,:,2])

conv4=(layers.Conv1D(32,(7),activation='relu'))
conv4=conv4(input[:,:,3])

conv5=(layers.Conv1D(32,(7),activation='relu'))
conv5=conv5(input[:,:,4])

conv6=(layers.Conv1D(32,(7),activation='relu'))
conv6=conv6(input[:,:,5])

conv7=(layers.Conv1D(32,(7),activation='relu'))
conv7=conv7(input[:,:,6])

conv8=(layers.Conv1D(32,(7),activation='relu'))
conv8=conv8(input[:,:,7])

conv9=(layers.Conv1D(32,(7),activation='relu'))
conv9=conv9(input[:,:,8])

conv10=(layers.Conv1D(32,(7),activation='relu'))
conv10=conv10(input[:,:,9])

conv11=(layers.Conv1D(32,(7),activation='relu'))
conv11=conv11(input[:,:,10])

conv12=(layers.Conv1D(32,(7),activation='relu'))
conv12=conv12(input[:,:,11])

conv13=tf.keras.layers.concatenate([conv1,conv2,conv3,conv4,conv5,conv6,conv7,conv8,conv9,conv10,conv11,conv12], axis=2)



The error says:


ValueError: Input 0 of layer conv1d_6 is incompatible with the layer: expected ndim=3, found ndim=2. Full shape received: [None, 1000]

Can anyone please help me with this issue?

The error message explains the problem: input[:,:,11] collapses the tensor from 3 dimensions to 2, so you need to use input[:,:,11:12] (or tf.expand_dims(input[:,:,11], axis=-1)) to preserve the last dimension.
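As a quick check (a minimal sketch, assuming TensorFlow 2.x), you can compare the static shapes produced by the two slicing styles:

import tensorflow as tf

x = tf.keras.Input(shape=(1000, 12))
print(x[:, :, 0].shape)                           # (None, 1000)    -> ndim=2, rejected by Conv1D
print(x[:, :, 0:1].shape)                         # (None, 1000, 1) -> ndim=3, accepted
print(tf.expand_dims(x[:, :, 0], axis=-1).shape)  # (None, 1000, 1) -> ndim=3, accepted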

import tensorflow as tf
from tensorflow.keras.layers import Conv1D

n_layers = 12            # one branch per ECG lead
n_filters = 32
k = 7                    # kernel size
activation_fn = 'relu'


input = tf.keras.Input(shape=(1000, n_layers))

# One independent Conv1D layer per lead
layers = {
    f'conv{i}': Conv1D(n_filters, k, activation=activation_fn) for i in range(n_layers)
}

# Slicing with i:i+1 keeps the last dimension, so each branch sees shape (None, 1000, 1)
channels = [layers[f'conv{i}'](input[:,:,i:i+1]) for i in range(n_layers)]

# Concatenate the 12 branch outputs along the channel axis
convcat = tf.keras.layers.concatenate(channels, axis=2)
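To turn the concatenated features into a trainable model, here is a minimal sketch; the GlobalAveragePooling1D head and the 5 output classes are illustrative assumptions, not part of the original answer:

# Hypothetical classification head for illustration (5 classes is an assumption)
x = tf.keras.layers.GlobalAveragePooling1D()(convcat)
output = tf.keras.layers.Dense(5, activation='softmax')(x)

model = tf.keras.Model(inputs=input, outputs=output)
model.summary()   # each branch outputs (None, 994, 32); concatenated: (None, 994, 384)

An equivalent way to split the input is tf.split(input, n_layers, axis=-1), which returns twelve tensors of shape (None, 1000, 1), so no manual slicing is needed.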
