Library: Keras, backend: TensorFlow
I am training a binary classification model whose final layer has a single node with a sigmoid activation, and I am compiling the model with a binary cross-entropy loss. When I train the model, I notice that the loss is greater than 1. Is that right, or am I going wrong somewhere? I have checked the labels; they are all 0s and 1s.
Is it possible for the binary cross-entropy loss to be greater than 1?
Keras `binary_crossentropy` first converts your predicted probabilities back to logits, then uses `tf.nn.sigmoid_cross_entropy_with_logits` to compute the cross entropy and returns the mean. Mathematically speaking, if your label is 1 and your predicted probability is low (say 0.1), the cross entropy can be greater than 1: `losses.binary_crossentropy(tf.constant([1.]), tf.constant([0.1]))` evaluates to about 2.30.
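To see where that value comes from without needing TensorFlow installed, here is a minimal sketch that computes the same binary cross-entropy formula in plain Python (the `eps` clipping mirrors what Keras does internally to avoid `log(0)`; the exact epsilon value is an assumption for illustration):

```python
import math

def binary_crossentropy(y_true, y_pred, eps=1e-7):
    # Clip the prediction away from exactly 0 or 1 to avoid log(0),
    # as the Keras backend does before taking logarithms.
    p = min(max(y_pred, eps), 1 - eps)
    # BCE = -[y * log(p) + (1 - y) * log(1 - p)]
    return -(y_true * math.log(p) + (1 - y_true) * math.log(1 - p))

# Label is 1, but the model confidently predicts 0.1:
loss = binary_crossentropy(1.0, 0.1)
print(loss)  # -ln(0.1) ≈ 2.3026, well above 1
```

Since the loss for a true label of 1 is just `-ln(p)`, it grows without bound as the predicted probability `p` approaches 0, so values above 1 are perfectly normal for badly wrong predictions.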
Yes, this is correct: cross entropy is not restricted to any particular range; it is only guaranteed to be positive (> 0).