I'm trying to build a CNN. I have a one-hot-encoded "scipy.sparse.coo.coo_matrix" of shape "(109248, 101)". I need to build a two-layer Conv1D model on this data and concatenate it with an LSTM branch for further processing. I don't understand how to build the Conv1D layers; any help would be appreciated.
I have tried following the documentation, building the network both with the Sequential API and with the functional API, but it seems like I'm doing it wrong.
So I tried this:
from keras.models import Sequential
from keras.layers import Conv1D, Flatten

# input_tensor = Input(shape=(None, 101))
model = Sequential()
model.add(Conv1D(input_shape=(101, 1),
                 filters=16,
                 kernel_size=4,
                 padding='same'))
model.add(Conv1D(filters=16, kernel_size=4))
model.add(Flatten())
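Before fitting either model, the sparse matrix has to be made dense and given a channel axis, since Conv1D expects a 3-D input of shape (samples, steps, channels). A minimal sketch (the variable names and the stand-in data are my own; in the real case X_sparse is the (109248, 101) coo_matrix):

```python
import numpy as np
from scipy.sparse import coo_matrix

# Stand-in for the real one-hot coo_matrix of shape (109248, 101)
X_sparse = coo_matrix(np.eye(5))

X_dense = X_sparse.toarray()        # dense (samples, 101) in the real case
X_input = X_dense[..., np.newaxis]  # add channel axis -> (samples, 101, 1)
```

X_input can then be passed to model.fit as the Conv1D branch's input.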
And this, with the functional API:
x_rest = Conv1D(input_shape=(101,1), filters=16, kernel_size=4, padding='same')
x2 = Conv1D(input_shape=(101,1), filters=16, kernel_size=4, padding='same')(x_rest)
out2 = Flatten()(x2)
Neither of them seems to work. I always get an error like:
Layer concatenate_4 was called with an input that isn't a symbolic tensor. Received type: . Full input: [, ]. All inputs to the layer should be tensors.
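A likely cause of that error (an assumption on my part, since the concatenate call itself isn't shown): in the functional snippet above, Conv1D(...) is a Layer object, not a tensor; a layer only produces a symbolic tensor once it is called on one, starting from an Input. A minimal sketch of the corrected pattern:

```python
from keras.layers import Input, Conv1D, Flatten, Concatenate
from keras.models import Model

inp_a = Input(shape=(101, 1))
inp_b = Input(shape=(101, 1))

# Calling each layer on a tensor returns a tensor; passing bare Layer
# objects to Concatenate raises "isn't a symbolic tensor".
a = Flatten()(Conv1D(16, 4, padding='same')(inp_a))
b = Flatten()(Conv1D(16, 4, padding='same')(inp_b))

merged = Concatenate()([a, b])  # both inputs are now symbolic tensors
model = Model([inp_a, inp_b], merged)
```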
This is the architecture I'm trying to build:
Layer (type) Output Shape Param # Connected to
==================================================================================================
main_input (InputLayer) (None, 150) 0
__________________________________________________________________________________________________
rest_input (InputLayer) (None, 101, 1) 0
__________________________________________________________________________________________________
embedding_3 (Embedding) (None, 150, 300) 16873200 main_input[0][0]
__________________________________________________________________________________________________
conv1d_24 (Conv1D) (None, 99, 64) 256 rest_input[0][0]
__________________________________________________________________________________________________
lstm_3 (LSTM) (None, 150, 32) 42624 embedding_3[0][0]
__________________________________________________________________________________________________
conv1d_25 (Conv1D) (None, 97, 64) 12352 conv1d_24[0][0]
__________________________________________________________________________________________________
flatten_5 (Flatten) (None, 4800) 0 lstm_3[0][0]
__________________________________________________________________________________________________
flatten_7 (Flatten) (None, 6208) 0 conv1d_25[0][0]
__________________________________________________________________________________________________
concatenate_3 (Concatenate) (None, 11008) 0 flatten_5[0][0]
flatten_7[0][0]
__________________________________________________________________________________________________
dense_7 (Dense) (None, 1) 11009 concatenate_3[0][0]
__________________________________________________________________________________________________
dropout_3 (Dropout) (None, 1) 0 dense_7[0][0]
__________________________________________________________________________________________________
dense_8 (Dense) (None, 1) 2 dropout_3[0][0]
__________________________________________________________________________________________________
dense_9 (Dense) (None, 1) 2 dense_8[0][0]
__________________________________________________________________________________________________
main_output (Dense) (None, 1) 2 dense_9[0][0]
==================================================================================================
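For reference, here is a functional-API sketch that reproduces the summary above. The vocabulary size 56244 and the activations are assumptions inferred from the parameter counts (56244 * 300 = 16,873,200 embedding parameters); kernel_size=3 and 64 filters follow from the Conv1D output shapes:

```python
from keras.layers import (Input, Embedding, LSTM, Conv1D, Flatten,
                          Concatenate, Dense, Dropout)
from keras.models import Model

main_input = Input(shape=(150,), name='main_input')
rest_input = Input(shape=(101, 1), name='rest_input')

# LSTM branch: return_sequences=True gives (None, 150, 32) as in the summary
emb = Embedding(input_dim=56244, output_dim=300)(main_input)
lstm = LSTM(32, return_sequences=True)(emb)
lstm_flat = Flatten()(lstm)                           # (None, 4800)

# Conv branch: no padding, so 101 -> 99 -> 97 steps
conv = Conv1D(filters=64, kernel_size=3)(rest_input)  # (None, 99, 64)
conv = Conv1D(filters=64, kernel_size=3)(conv)        # (None, 97, 64)
conv_flat = Flatten()(conv)                           # (None, 6208)

merged = Concatenate()([lstm_flat, conv_flat])        # (None, 11008)
x = Dense(1)(merged)
x = Dropout(0.5)(x)
x = Dense(1)(x)
x = Dense(1)(x)
main_output = Dense(1, activation='sigmoid', name='main_output')(x)

model = Model(inputs=[main_input, rest_input], outputs=main_output)
```

model.summary() on this model matches the layer shapes and parameter counts shown above.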
The first version of your code seems to be working. Here is the model it builds:
model.summary()
_________________________________________________________________
Layer (type) Output Shape Param #
=================================================================
conv1d_3 (Conv1D) (None, 101, 16) 80
_________________________________________________________________
conv1d_4 (Conv1D) (None, 98, 16) 1040
_________________________________________________________________
flatten_1 (Flatten) (None, 1568) 0
=================================================================
Total params: 1,120
Trainable params: 1,120
Non-trainable params: 0
_________________________________________________________________
It seems that the problem is related to the LSTM layer you want to use next (though I cannot help with that part, since you did not provide that code). You may find a solution here.