Keras custom loss function - shape mismatch despite returning same shape as categorical crossentropy
I've created a custom loss function based on cosine similarity:
import tensorflow as tf

def cos_loss(y_true, y_pred):
    norm_pred = tf.math.l2_normalize(y_pred)
    dprod = tf.tensordot(
        a=y_true,
        b=norm_pred,
        axes=1
    )
    return 1 - dprod
However, training a model with this custom loss results in the error In[0] mismatch In[1] shape: 2 vs. 8: [8,2] [8,2] 0 0. If I use a built-in loss function like categorical crossentropy, the model trains without issue.
This is despite my custom loss and categorical crossentropy returning values that are exactly the same type and shape. For example, I create test y_true and y_pred arrays and run them through both:
import numpy as np

test_true = np.asarray([1.0, 0.0])
test_pred = np.asarray([0.9, 0.2])
print(cos_loss(test_true, test_pred))
print(tf.keras.losses.categorical_crossentropy(test_true, test_pred))
which returns:
tf.Tensor(0.023812939816047263, shape=(), dtype=float64)
tf.Tensor(0.20067069546215124, shape=(), dtype=float64)
So both give TF tensors with a single float64 value and no shape. Why am I getting a shape mismatch error with one but not the other, if the output shapes are the same? Thanks.
Your loss function should be able to take in a batch of predictions and ground truths and return a batch of loss values. At the moment, that's not the case: a tensordot with axes=1 is a matrix multiplication, so you get a conflict of dimensions as soon as you introduce a batch dimension.
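To see the conflict concretely, here is a minimal sketch (the 8x2 shapes are simply the ones from your error message): with two [8, 2] tensors, axes=1 tries to contract the last axis of the first argument (size 2) against the first axis of the second (size 8), which is exactly the "2 vs. 8" mismatch you see. With single un-batched vectors the same call is an ordinary dot product, which is why your manual test succeeded:

import tensorflow as tf

# Batched inputs, shaped like the [8, 2] tensors in the error message
batch_true = tf.ones([8, 2])
batch_pred = tf.ones([8, 2])

# axes=1 contracts the last axis of the first argument (size 2) with the
# first axis of the second (size 8) -- a matrix multiplication -- so this
# raises an InvalidArgumentError ("... shape: 2 vs. 8: [8,2] [8,2] ...")
# tf.tensordot(batch_true, batch_pred, axes=1)

# Un-batched 1-D vectors have only one axis each, so the same call is a
# plain dot product and returns a scalar:
print(tf.tensordot(tf.constant([1.0, 0.0]), tf.constant([0.9, 0.2]), axes=1))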
You can probably use the following instead:
def cos_loss(y_true, y_pred):
    # Normalize each prediction vector independently along the class axis
    # (without axis=-1, l2_normalize would normalize over the whole batch)
    norm_pred = tf.math.l2_normalize(y_pred, axis=-1)
    # Element-wise multiply and sum over the last axis: a per-sample dot
    # product, so the result keeps the batch dimension
    dprod = tf.reduce_sum(y_true * norm_pred, axis=-1)
    return 1 - dprod
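As a quick sanity check (the batch values below are made up for illustration), the corrected loss maps a [batch, classes] pair of inputs to a [batch] vector of per-sample losses, which is what Keras expects when it reduces the loss over the batch:

# Hypothetical batch of 8 identical samples, matching the shapes above
batch_true = tf.constant([[1.0, 0.0]] * 8)
batch_pred = tf.constant([[0.9, 0.2]] * 8)

# One loss value per sample: a tf.Tensor of shape (8,), each ~0.0238
print(cos_loss(batch_true, batch_pred))

# The function can then be passed directly to compile, e.g.
# model.compile(optimizer="adam", loss=cos_loss)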