Keras TensorFlow: binary cross-entropy loss greater than 1
Library: Keras, backend: TensorFlow
I am training a single-class/binary classification problem, where my final layer has a single node with a sigmoid activation. I am compiling my model with a binary cross-entropy loss. When I run the code to train my model, I notice that the loss is greater than 1. Is that right, or am I going wrong somewhere? I have checked the labels; they are all 0s and 1s.
Is it possible for the binary cross-entropy loss to be greater than 1?
Keras binary_crossentropy first converts your predicted probabilities to logits. Then it uses tf.nn.sigmoid_cross_entropy_with_logits to calculate the cross entropy and returns the mean to you. Mathematically speaking, if your label is 1 and your predicted probability is low (like 0.1), the cross entropy can be greater than 1, e.g. losses.binary_crossentropy(tf.constant([1.]), tf.constant([0.1])).
Yes, it is correct: cross entropy is not restricted to any particular range, only to being positive (> 0).
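As a quick illustration that the loss has no upper bound (the probe values below are arbitrary, again assuming TensorFlow 2.x):

    import tensorflow as tf

    # The worse the prediction for the true class, the larger the loss
    for p in [0.5, 0.1, 0.01, 0.001]:
        loss = tf.keras.losses.binary_crossentropy(
            tf.constant([1.0]), tf.constant([p])
        )
        print(p, float(loss))
    # Prints roughly 0.69, 2.30, 4.61, 6.91: the loss grows without
    # bound as the predicted probability of the true class approaches 0.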