
Why are the cross-entropy losses of MNIST classification using TensorFlow's Estimator high-level API and the raw API different in scale?

I am reading some TensorFlow example code, and I find that the loss in the CNN using the Estimator API and the loss in the raw CNN differ noticeably in scale, even though they use the same loss function:

The former is loss_op = tf.reduce_mean(tf.nn.sparse_softmax_cross_entropy_with_logits(logits=logits_train, labels=tf.cast(labels, dtype=tf.int32))), which uses integer class labels rather than one-hot labels.

The latter is loss_op = tf.reduce_mean(tf.nn.softmax_cross_entropy_with_logits(logits=logits, labels=Y)), which uses one-hot vector labels.
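
As a sanity check that these really are the same loss, here is a minimal sketch (assuming TensorFlow 1.x; the logits and labels are made up for illustration, not taken from the examples) showing that the sparse and one-hot versions return the same value when the labels agree:

import numpy as np
import tensorflow as tf

# Fake logits for 4 samples and 10 classes, just for illustration
logits = tf.constant(np.random.randn(4, 10), dtype=tf.float32)
labels_int = tf.constant([3, 1, 7, 0], dtype=tf.int32)      # integer class labels
labels_onehot = tf.one_hot(labels_int, depth=10)            # same labels, one-hot encoded

# Sparse version (integer labels), as in the Estimator example
loss_sparse = tf.reduce_mean(
    tf.nn.sparse_softmax_cross_entropy_with_logits(logits=logits, labels=labels_int))
# Dense version (one-hot labels), as in the raw example
loss_dense = tf.reduce_mean(
    tf.nn.softmax_cross_entropy_with_logits(logits=logits, labels=labels_onehot))

with tf.Session() as sess:
    print(sess.run([loss_sparse, loss_dense]))  # the two values should match

The two printed values should agree, which suggests the scale difference comes from the models themselves rather than from the loss ops.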

Why is the former loss roughly in the range 0 to 2.39026, while the latter loss is much bigger?

I have figured it out: it is because the variable initializers differ; the default for tf.layers.* is not tf.random_normal(). As for the larger loss, it comes from the internal handling of log(0) inside softmax_cross_entropy_with_logits. I think the lower loss is more accurate, since log(1e-5) = -11.
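
To make the initializer point concrete, here is a rough sketch (again assuming TensorFlow 1.x; it uses a single dense layer on fake MNIST-shaped data instead of the actual CNNs, so the exact numbers are only illustrative). The tf.layers default (glorot_uniform) keeps the initial logits small, so the starting loss sits near log(10) ≈ 2.3, whereas weights drawn from tf.random_normal() with the default stddev of 1.0 blow the logits up and give a much larger starting loss:

import numpy as np
import tensorflow as tf

# Fake flattened MNIST batch: 128 images of 28*28 = 784 pixels, labels 0-9
x = tf.constant(np.random.rand(128, 784), dtype=tf.float32)
labels = tf.constant(np.random.randint(0, 10, size=128), dtype=tf.int32)

# Estimator-style layer: tf.layers.dense defaults to a glorot_uniform kernel initializer
logits_glorot = tf.layers.dense(x, 10)

# "Raw"-style weights: tf.random_normal() with stddev 1.0 makes the logits huge
W = tf.Variable(tf.random_normal([784, 10]))
b = tf.Variable(tf.random_normal([10]))
logits_normal = tf.matmul(x, W) + b

loss_glorot = tf.reduce_mean(
    tf.nn.sparse_softmax_cross_entropy_with_logits(logits=logits_glorot, labels=labels))
loss_normal = tf.reduce_mean(
    tf.nn.sparse_softmax_cross_entropy_with_logits(logits=logits_normal, labels=labels))

with tf.Session() as sess:
    sess.run(tf.global_variables_initializer())
    # Expect roughly 2.3 for the glorot-initialized logits and a much larger
    # value for the random_normal-initialized ones
    print(sess.run([loss_glorot, loss_normal]))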
