How can I calculate cross-entropy on a sigmoid neural network binary outcome?
I'm currently building a neural network from scratch, where the goal is to predict, based on two input variables (X_1 and X_2), what the output will be (0 or 1). I have 2 hidden layers with sigmoid activation on all neurons, but I get stuck when calculating the cross-entropy. Suppose in the output layer I have the predictions [0.50, 0.57], while the true output is 0, i.e. [1, 0]. How do I calculate the cross-entropy for this binary output example? Does anyone have any suggestions/tips?
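For context, the quantity I'm trying to compute appears to be the average binary cross-entropy over m examples (this is what the function below implements):

L = -(1/m) * sum_i [ y_i * log(p_i) + (1 - y_i) * log(1 - p_i) ]

where y_i is the true label (0 or 1) and p_i is the predicted probability of label 1.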
Here is a function that I wrote and use to calculate the cross-entropy, given a list of predictions and a list of true labels.
from math import log

# calculate the average cross-entropy of predictions p against true labels y
def cross_entropy(y, p):
    # y is the list of true labels (each 0 or 1)
    # p[i] is the predicted probability that example i has label 1
    m = len(y)
    sum_vals = 0
    for i in range(m):
        # first term contributes when label=1, second term when label=0
        sum_vals += float(y[i]) * log(p[i]) + (1 - float(y[i])) * log(1 - p[i])
    R = -sum_vals / m
    return R
Here the values in the list of labels y are each either 0 or 1, and the probabilistic predictions from the network are in the list p.
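As an illustration, here is how the function above would be called on the example values from the question, assuming the two output neurons are treated as two independent sigmoid predictions (which is how the function consumes them):

# hypothetical usage with the example from the question:
# one-hot true label [1, 0] and network predictions [0.50, 0.57]
y = [1, 0]
p = [0.50, 0.57]
print(cross_entropy(y, p))  # -(log(0.50) + log(1 - 0.57)) / 2 ≈ 0.769

Note that this treats each output neuron separately; whether that is the right interpretation for a one-hot binary target is exactly what I'm unsure about.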