Why is the input of layer batch_normalization_6 incompatible with the layer?
I want to train features of shape (10151, 1285) against labels of shape (10151, 257), and I want to use Way 2, since I want to use "feature_input" in the cost function. But it fails with this error:
ValueError: Input 0 of layer batch_normalization_6 is incompatible with the layer: expected ndim=3, found ndim=2. Full shape received: (None, 257)
I am wondering why?
Way 1:

from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import Dense, BatchNormalization, Activation

model = Sequential()
model.add(Dense(257, input_dim=1285))
model.add(BatchNormalization())
model.add(Activation('sigmoid'))
model.compile(optimizer='adam', loss='mse', metrics=['mse'])
model.fit(feature, label)
model.save("./model.hdf5")
Way 2:

from tensorflow.keras.models import Model
from tensorflow.keras.layers import Input, Dense, BatchNormalization, Activation

feature_input = Input(shape=(None, 1285))
dense = Dense(257)(feature_input)
norm = BatchNormalization()(dense)
out = Activation('sigmoid')(norm)
model = Model(feature_input, out)
model.compile(optimizer='adam', loss='mse', metrics=['mse'])
model.fit(feature, label)
model.save("./model.hdf5")
If you define the input shape as (None, 1285), the model treats the input as 3-dimensional data. I guess the None you entered was meant to describe the batch size, but Keras adds the batch dimension automatically, so with shape=(None, 1285) every layer in the model is built expecting a 3-dimensional input. Therefore, you can use an input shape of (1285,) instead.
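As a minimal sketch of the corrected Way 2 (assuming TensorFlow 2.x Keras; the feature/label arrays are omitted, so only the model is built here):

```python
from tensorflow.keras.models import Model
from tensorflow.keras.layers import Input, Dense, BatchNormalization, Activation

# (1285,) describes a single sample; Keras adds the batch dimension itself
feature_input = Input(shape=(1285,))
dense = Dense(257)(feature_input)
norm = BatchNormalization()(dense)  # now built for 2-D (batch, features) input
out = Activation('sigmoid')(norm)

model = Model(feature_input, out)
model.compile(optimizer='adam', loss='mse', metrics=['mse'])
print(model.output_shape)  # (None, 257)
```

With this shape, model.fit(feature, label) accepts the (10151, 1285) features and (10151, 257) labels directly.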
<Summary of your model>
Model: "model"
_________________________________________________________________
Layer (type)                 Output Shape              Param #
=================================================================
input_1 (InputLayer)         [(None, None, 1285)]      0
_________________________________________________________________
dense (Dense)                (None, None, 257)         330502
_________________________________________________________________
batch_normalization (BatchNo (None, None, 257)         1028
_________________________________________________________________
activation (Activation)      (None, None, 257)         0
=================================================================
Total params: 331,530
Trainable params: 331,016
Non-trainable params: 514
_________________________________________________________________
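The (None, None, 257) rows above are the giveaway. A quick rank check (again assuming TensorFlow 2.x Keras) shows how the two input shapes differ:

```python
from tensorflow.keras.layers import Input

# shape=(None, 1285) means each sample is a (None, 1285) matrix,
# so the full batch tensor is rank 3: (batch, None, 1285)
x3 = Input(shape=(None, 1285))

# shape=(1285,) means each sample is a 1285-vector,
# giving the rank-2 batch tensor the Dense/BatchNormalization stack expects
x2 = Input(shape=(1285,))

print(len(x3.shape))  # 3
print(len(x2.shape))  # 2
```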