
Keras Convolution1D input shape

I am trying to build a simple Convolutional NN with:

  • 340 samples,
  • 260 rows per sample,
  • 16 features per row.

I thought the order of the shape is (batch_size, steps, input_dim), which I believe would mean (340, 16, 260).

Current code:

from keras.models import Sequential
from keras.layers import Conv1D, MaxPooling1D, GlobalAveragePooling1D, Dense

model = Sequential()
model.add(Conv1D(64, kernel_size=3, activation='relu', input_shape=(340, 16, 260)))
# model.add(Conv2D(64, 2, activation='relu'))
model.add(MaxPooling1D())
# model.add(Conv2D(128, 2, activation='relu'))
model.add(Conv1D(64, kernel_size=3, activation='relu'))
model.add(GlobalAveragePooling1D())
model.add(Dense(1, activation='linear'))

model.compile(loss='binary_crossentropy',
              optimizer='rmsprop',
              metrics=['accuracy'])

model.summary()

model.fit(xTrain, yTrain, batch_size=16, epochs=1000)

I am getting an error:

ValueError: Input 0 is incompatible with layer conv1d_1: expected ndim=3, found ndim=4

I am very lost and believe that my shapes are off. Could someone help me? Thank you!

As mentioned in this answer, layers in Keras accept two arguments: input_shape and batch_input_shape. The difference is that input_shape does not contain the batch size, while batch_input_shape is the full input shape, including the batch size.

Based on this, I think the specification input_shape=(340, 16, 260) tells Keras to expect a 4-dimensional input, which is not what you want. The correct argument would be batch_input_shape=(340, 16, 260).
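A quick NumPy-only sketch of the two conventions (no TensorFlow needed), using hypothetical placeholder data. Note that with 260 rows (steps) per sample and 16 features per row, the (samples, steps, features) layout would actually be (340, 260, 16), so the per-sample input_shape Conv1D expects is (260, 16):

```python
import numpy as np

# Hypothetical training data laid out as (samples, steps, features):
# 340 samples, 260 rows (time steps) per sample, 16 features per row.
x_train = np.zeros((340, 260, 16))

# batch_input_shape is the full input shape, batch dimension included.
batch_input_shape = x_train.shape      # (340, 260, 16)

# input_shape is the shape of one sample, batch dimension excluded;
# Keras prepends the batch axis itself, giving the expected ndim=3 tensor.
input_shape = x_train.shape[1:]        # (260, 16)

print(batch_input_shape)  # (340, 260, 16)
print(input_shape)        # (260, 16)
```

With this layout, either Conv1D(64, kernel_size=3, activation='relu', input_shape=(260, 16)) or the equivalent batch_input_shape=(340, 260, 16) should avoid the ndim mismatch.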
