
Why is layer batch_normalization_6 incompatible with the layer?

I want to train a feature of size (10151, 1285) to a label of size (10151, 257), and I want to use Way2, since I want to use "feature_input" in the cost function. But it fails with this error:

ValueError: Input 0 of layer batch_normalization_6 is incompatible with the layer: expected ndim=3, found ndim=2. Full shape received: (None, 257).

I am wondering why?

Way1:
from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import Dense, BatchNormalization, Activation

model = Sequential()
model.add(Dense(257, input_dim=1285))
model.add(BatchNormalization())
model.add(Activation('sigmoid'))

model.compile(optimizer='adam', loss='mse',  metrics=['mse'])
model.fit(feature, label)
model.save("./model.hdf5")

Way2:
from tensorflow.keras.layers import Input, Dense, BatchNormalization, Activation
from tensorflow.keras.models import Model

feature_input = Input(shape=(None, 1285))
dense = Dense(257)(feature_input)
norm = BatchNormalization()(dense)
out = Activation('sigmoid')(norm)
model = Model(feature_input, out)

model.compile(optimizer='adam', loss='mse',  metrics=['mse'])
model.fit(feature, label)
model.save("./model.hdf5")

If you define the input shape as (None, 1285), the model treats each sample as 2-dimensional, so the model as a whole expects 3-dimensional input. I guess the None you entered was meant to describe the batch size, but Input(shape=...) describes a single sample; the batch dimension is prepended automatically when the model is built. As a result, the BatchNormalization layer is built to expect ndim=3, and it rejects the 2-dimensional (None, 257) tensor it receives when you fit on your actual 2-dimensional data. Therefore, you can use an input shape of (1285,) instead.
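For example, here is Way2 with only the input shape changed (a minimal sketch; the imports assume the tensorflow.keras API):

from tensorflow.keras.layers import Input, Dense, BatchNormalization, Activation
from tensorflow.keras.models import Model

# shape=(1285,) describes a single sample; Keras prepends the batch dimension
feature_input = Input(shape=(1285,))
dense = Dense(257)(feature_input)
norm = BatchNormalization()(dense)
out = Activation('sigmoid')(norm)
model = Model(feature_input, out)

model.compile(optimizer='adam', loss='mse', metrics=['mse'])
model.fit(feature, label)
model.save("./model.hdf5")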

For reference, the summary of your current model shows the extra None dimension:

Model: "model"
_________________________________________________________________
Layer (type)                 Output Shape              Param #   
=================================================================
input_1 (InputLayer)         [(None, None, 1285)]      0         
_________________________________________________________________
dense (Dense)                (None, None, 257)         330502    
_________________________________________________________________
batch_normalization (BatchNo (None, None, 257)         1028      
_________________________________________________________________
activation (Activation)      (None, None, 257)         0         
=================================================================
Total params: 331,530
Trainable params: 331,016
Non-trainable params: 514
_________________________________________________________________
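Finally, since the stated reason for preferring Way2 was to use feature_input in the cost function: one common way to do that in the functional API is model.add_loss, which accepts a tensor that may depend on the model's inputs. A minimal sketch, in which the input-dependent penalty term and its 0.01 weight are purely illustrative assumptions, not something from the original question:

import tensorflow as tf
from tensorflow.keras.layers import Input, Dense, BatchNormalization, Activation
from tensorflow.keras.models import Model

feature_input = Input(shape=(1285,))
dense = Dense(257)(feature_input)
norm = BatchNormalization()(dense)
out = Activation('sigmoid')(norm)
model = Model(feature_input, out)

# Hypothetical extra term that depends on the input tensor itself;
# the L2 penalty and the 0.01 weight are illustrative only.
model.add_loss(0.01 * tf.reduce_mean(tf.square(feature_input)))

model.compile(optimizer='adam', loss='mse', metrics=['mse'])

During training, the tensor passed to add_loss is summed with the compiled 'mse' loss, so the cost function effectively sees feature_input.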
