2.3 ratio between PyTorch BCELoss and my own "log" calculations
I'm scripting a toy model to practice both PyTorch and GAN models, and I'm making sure I understand each step as much as possible. That led me to check my understanding of the BCELoss function, and apparently I do understand it... up to a ratio of 2.3.
To check the results, I export the intermediate values to Excel:
tmp1 = y_pred.tolist() # predicted values as a list (to copy/paste into Excel)
tmploss = nn.BCELoss(reduction='none') # redefining the loss to get the whole BCELoss tensor
tmp2 = tmploss(y_pred, y_real).tolist() # BCELoss values as a list (to copy/paste into Excel)
Then I copy tmp1 into Excel and calculate -log(x) for each value, which is the BCELoss formula for y_target = y_real = 1.
Then I compare the resulting values with the values of tmp2: those values are 2.3x higher than "mine".
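A minimal sketch of the comparison being described (the prediction values here are made up for illustration): with all targets equal to 1, BCELoss per element reduces to the negative natural log of the prediction, so dividing by a base-10 version reproduces the constant ratio the question reports.

```python
import torch
import torch.nn as nn

# Hypothetical sigmoid outputs; all targets are 1, as in the question
y_pred = torch.tensor([0.9, 0.5, 0.1])
y_real = torch.ones_like(y_pred)

# Per-element BCE loss, as in the question's snippet
loss = nn.BCELoss(reduction='none')(y_pred, y_real)

manual = -torch.log(y_pred)        # natural log (base e), what PyTorch computes
excel_style = -torch.log10(y_pred) # base-10 log, what Excel's LOG(x) gives by default

print(loss)               # matches `manual`
print(loss / excel_style) # constant ratio ~2.3 for every element
```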
(Sorry, I couldn't figure out how to format tables on this site...)
Can you please tell me what is happening? I feel a PEBCAK coming :-)
This is because in Excel the LOG function calculates the logarithm to the base 10, while the standard definition of binary cross entropy uses a logarithm to the base e. The ratio you're seeing is just ln(10) = 2.302585...
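The constant factor follows from the change-of-base identity log10(x) = ln(x) / ln(10), so the natural-log term divided by the base-10 term is always ln(10), regardless of x. A quick check:

```python
import math

x = 0.3  # any predicted probability in (0, 1)

bce_term = -math.log(x)      # natural log, as used by PyTorch's BCELoss
excel_term = -math.log10(x)  # base-10 log, Excel's default LOG()

ratio = bce_term / excel_term
print(ratio)         # 2.302585..., i.e. ln(10)
print(math.log(10))  # the same constant
```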