
ValueError: Shapes (None, None) and (None, None, None, 3) are incompatible

I am using EfficientNetV2B0, and everything works perfectly until I try to fit the model. On the train_generator I tried class_mode = 'sparse' and also class_mode = 'categorical', and each throws a different kind of error message. I have spent a few days on this but still can't solve it; can someone please help?

Here is the full error message:

ValueError: in user code:

File "/Users/ba/opt/anaconda3/envs/tensorflow/lib/python3.8/site-packages/keras/engine/training.py", line 1051, in train_function  *
    return step_function(self, iterator)
File "/Users/ba/opt/anaconda3/envs/tensorflow/lib/python3.8/site-packages/keras/engine/training.py", line 1040, in step_function  **
    outputs = model.distribute_strategy.run(run_step, args=(data,))
File "/Users/ba/opt/anaconda3/envs/tensorflow/lib/python3.8/site-packages/keras/engine/training.py", line 1030, in run_step  **
    outputs = model.train_step(data)
File "/Users/ba/opt/anaconda3/envs/tensorflow/lib/python3.8/site-packages/keras/engine/training.py", line 890, in train_step
    loss = self.compute_loss(x, y, y_pred, sample_weight)
File "/Users/ba/opt/anaconda3/envs/tensorflow/lib/python3.8/site-packages/keras/engine/training.py", line 948, in compute_loss
    return self.compiled_loss(
File "/Users/ba/opt/anaconda3/envs/tensorflow/lib/python3.8/site-packages/keras/engine/compile_utils.py", line 201, in __call__
    loss_value = loss_obj(y_t, y_p, sample_weight=sw)
File "/Users/ba/opt/anaconda3/envs/tensorflow/lib/python3.8/site-packages/keras/losses.py", line 139, in __call__
    losses = call_fn(y_true, y_pred)
File "/Users/ba/opt/anaconda3/envs/tensorflow/lib/python3.8/site-packages/keras/losses.py", line 243, in call  **
    return ag_fn(y_true, y_pred, **self._fn_kwargs)
File "/Users/ba/opt/anaconda3/envs/tensorflow/lib/python3.8/site-packages/keras/losses.py", line 1787, in categorical_crossentropy
    return backend.categorical_crossentropy(
File "/Users/ba/opt/anaconda3/envs/tensorflow/lib/python3.8/site-packages/keras/backend.py", line 5119, in categorical_crossentropy
    target.shape.assert_is_compatible_with(output.shape)

ValueError: Shapes (None, None) and (None, None, None, 3) are incompatible

Here is my code:

x = base_model.layers[-6].output
x = Dense(1024,activation='relu')(x) #dense layer 1
x = Dense(512,activation='relu')(x) #dense layer 2
output = Dense(CLASSES, activation='softmax')(x) #final layer with softmax activation
model = Model(inputs=base_model.input, outputs=output)

I tried both CategoricalCrossentropy and SparseCategoricalCrossentropy.

model.compile(optimizer=tf.keras.optimizers.Adam(learning_rate=0.0001), 
              loss=tf.keras.losses.CategoricalCrossentropy(), 
              metrics=['accuracy'])

history = model.fit(
    x=train_generator,
    steps_per_epoch = 54,
    epochs = EPOCH,
    validation_data = validation_generator,
    validation_steps = 6,
    verbose = 2,
    shuffle = True
)

When I try SparseCategoricalCrossentropy():

model.compile(optimizer=tf.keras.optimizers.Adam(learning_rate=0.0001), 
                  loss=tf.keras.losses.SparseCategoricalCrossentropy(), 
                  metrics=['accuracy'])

It throws a different error message:

InvalidArgumentError: Graph execution error:

File "/Users/ba/opt/anaconda3/envs/tensorflow/lib/python3.8/site-packages/keras/backend.py", line 5238, in sparse_categorical_crossentropy
      res = tf.nn.sparse_softmax_cross_entropy_with_logits(
Node: 'sparse_categorical_crossentropy/SparseSoftmaxCrossEntropyWithLogits/SparseSoftmaxCrossEntropyWithLogits'
logits and labels must have the same first dimension, got logits shape [1568,3] and labels shape [32]
     [[{{node sparse_categorical_crossentropy/SparseSoftmaxCrossEntropyWithLogits/SparseSoftmaxCrossEntropyWithLogits}}]] [Op:__inference_train_function_26092]

The problem here is that you are feeding the loss function two structures with different shapes: the target is a 2-dimensional tensor (None, None), while the model output is a 4-dimensional tensor (None, None, None, 3).

You need to check what the <data, target> pair looks like and make sure the loss is fed two tensors with compatible shapes. This applies to any loss you use.
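As a minimal sketch of that check (assuming train_generator comes from ImageDataGenerator.flow_from_directory and the model is already built), pull one batch and compare its shapes with the model's output shape:

x_batch, y_batch = next(train_generator)    # one batch of images and labels
print("images:", x_batch.shape)             # e.g. (32, 224, 224, 3)
print("labels:", y_batch.shape)             # e.g. (32, 3) with class_mode='categorical'
print("model output:", model.output_shape)  # should be (None, CLASSES), not 4-dimensional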

The GlobalAveragePooling2D block takes a tensor of size (input width) x (input height) x (input channels) and computes the average of all values over the entire (input width) x (input height) grid for each of the (input channels), so the spatial dimensions collapse and only the channel dimension remains.
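A small, self-contained sketch of that shape change (the 7 x 7 x 1280 feature-map size is only an illustrative assumption, roughly what EfficientNet-style backbones produce):

import tensorflow as tf

feature_map = tf.random.normal((2, 7, 7, 1280))            # (batch, height, width, channels)
pooled = tf.keras.layers.GlobalAveragePooling2D()(feature_map)
print(pooled.shape)                                        # (2, 1280): spatial dims averaged away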

Thus adding this line solved everything: x = GlobalAveragePooling2D()(x)

x = base_model.layers[-6].output
x = GlobalAveragePooling2D()(x)
x = Dense(1024,activation='relu')(x) #dense layer 1
x = Dense(512,activation='relu')(x) #dense layer 2
output = Dense(CLASSES, activation='softmax')(x) #final layer with softmax activation
model = Model(inputs=base_model.input, outputs=output)
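As a quick sanity check (assuming CLASSES = 3, as suggested by the error message), the model output should now be 2-dimensional and match the one-hot labels produced by class_mode='categorical':

print(model.output_shape)  # (None, 3): matches the (batch, CLASSES) labels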
