
SparseCategoricalCrossentropy Shape Mismatch

I wanted to run a simple test of the SparseCategoricalCrossentropy function to see exactly what it does to the output. For that, I used the output of the last layer of MobileNetV2.

    import keras.backend as K
    import numpy as np
    import tensorflow as tf

    full_model = tf.keras.applications.MobileNetV2(
        input_shape=(224, 224, 3),
        alpha=1.0,
        include_top=True,
        weights="imagenet",
        input_tensor=None,
        pooling=None,
        classes=1000,
        classifier_activation="softmax",
    )

    func = K.function(full_model.layers[1].input, full_model.layers[155].output)
    conv_output = func([processed_image])
    y_pred = np.single(conv_output)

    y_true = np.zeros(1000).reshape(1, 1000)
    y_true[0][282] = 1

    scce = tf.keras.losses.SparseCategoricalCrossentropy()
    scce(y_true, y_pred).numpy()

processed_image is a 1x224x224x3 array created earlier.

I get the error ValueError: Shape mismatch: The shape of labels (received (1000,)) should equal the shape of logits except for the last dimension (received (1, 1000)).

I tried reshaping the array to match the dimensions mentioned in the error, but it doesn't seem to work. What shape does it accept?

Since you are using the SparseCategoricalCrossentropy loss function, y_true should have shape [batch_size] and y_pred should have shape [batch_size, num_classes]. In addition, y_true should consist of integer class indices. See the documentation. In your specific example, you can try something like this:

import keras.backend as K
import numpy as np
import tensorflow as tf

full_model = tf.keras.applications.MobileNetV2(
    input_shape=(224, 224, 3),
    alpha=1.0,
    include_top=True,
    weights="imagenet",
    input_tensor=None,
    pooling=None,
    classes=1000,
    classifier_activation="softmax",
)

batch_size = 1
processed_image = tf.random.uniform(shape=[batch_size, 224, 224, 3])
func = K.function(full_model.layers[1].input,
                  full_model.layers[155].output)
conv_output = func([processed_image])
y_pred = np.single(conv_output)

# Generate an integer between 0 and 999 (inclusive) representing a class index.
# Note that np.random.randint excludes `high`, so high=1000 covers all 1000 classes.
y_true = np.random.randint(low=0, high=1000, size=batch_size)
# e.g. [984]
scce = tf.keras.losses.SparseCategoricalCrossentropy()
scce(y_true, y_pred).numpy()
# y_pred encodes a probability distribution here; one run produced a loss of 10.69202

You can experiment with batch_size to see how everything works. In the example above I only used a batch_size of 1.
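To see the shape contract without loading a full model, the sparse loss can be illustrated with a plain NumPy sketch. The helper function and the sample numbers below are made up for illustration and are not part of TensorFlow; they just mimic what the loss computes for probability inputs:

```python
import numpy as np

def sparse_categorical_crossentropy(y_true, y_pred):
    """Mean negative log-probability of the true class per sample.

    y_true: integer class indices, shape (batch_size,)
    y_pred: probability distributions, shape (batch_size, num_classes)
    """
    batch_indices = np.arange(len(y_true))
    # Pick out the predicted probability of each sample's true class.
    true_class_probs = y_pred[batch_indices, y_true]
    return float(np.mean(-np.log(true_class_probs)))

# Probabilities for a batch of 2 samples over 4 classes.
y_pred = np.array([[0.10, 0.70, 0.10, 0.10],
                   [0.25, 0.25, 0.25, 0.25]])
# Integer indices, NOT one-hot vectors.
y_true = np.array([1, 3])

loss = sparse_categorical_crossentropy(y_true, y_pred)
# -(log(0.7) + log(0.25)) / 2 ≈ 0.8715
```

If you already have one-hot labels like the y_true in the question, you can either convert them with np.argmax(y_true, axis=-1) before calling SparseCategoricalCrossentropy, or use tf.keras.losses.CategoricalCrossentropy, which expects one-hot targets.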
