sigmoid_cross_entropy loss function from tensorflow for image segmentation

I'm trying to understand what the sigmoid_cross_entropy loss function does with regard to image segmentation neural networks.

Here is the relevant TensorFlow source code:

zeros = array_ops.zeros_like(logits, dtype=logits.dtype)
cond = (logits >= zeros)
relu_logits = array_ops.where(cond, logits, zeros)       # max(logits, 0)
neg_abs_logits = array_ops.where(cond, -logits, logits)  # -|logits|
return math_ops.add(
    relu_logits - logits * labels,
    math_ops.log1p(math_ops.exp(neg_abs_logits)), name=name)

My main question is: why is there a math_ops.add() at the return? Is the add referring to the summation of the loss over every pixel in the image, or is it doing something different? I'm not able to follow the dimensional changes well enough to deduce what the addition is doing.
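For reference, here is a small NumPy sketch (my own check, not part of the TensorFlow source) showing that the returned expression, max(x, 0) - x*z + log(1 + exp(-|x|)), equals the plain element-wise cross-entropy -z*log(sigmoid(x)) - (1-z)*log(1-sigmoid(x)), and that no reduction over pixels takes place:

import numpy as np

x = np.array([-2.0, -0.5, 0.0, 0.5, 2.0])  # logits
z = np.array([0.0, 1.0, 1.0, 0.0, 1.0])    # labels

# The numerically stable form used in the TensorFlow source above
stable = np.maximum(x, 0) - x * z + np.log1p(np.exp(-np.abs(x)))

# The naive element-wise sigmoid cross-entropy
p = 1.0 / (1.0 + np.exp(-x))
naive = -z * np.log(p) - (1 - z) * np.log(1 - p)

print(np.allclose(stable, naive))  # True
print(stable.shape)                # (5,): one loss per element, no summation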

sigmoid_cross_entropy_with_logits is used in multilabel classification.

The whole problem can be divided into a binary cross-entropy loss for each class prediction, since the classes are independent (e.g., 1 is both even and prime). Finally, collect all the per-prediction losses and average them.

Below is an example:

import tensorflow as tf


logits = tf.constant([[0, 1],
                      [1, 1],
                      [2, -4]], dtype=tf.float32)
y_true = tf.constant([[1, 1],
                      [1, 0],
                      [1, 0]], dtype=tf.float32)
# tensorflow api
loss = tf.losses.sigmoid_cross_entropy(multi_class_labels=y_true,
                                       logits=logits)

# manual computation
probs = tf.nn.sigmoid(logits)
loss_t = tf.reduce_mean(y_true * (-tf.log(probs)) +
                        (1 - y_true) * (-tf.log(1 - probs)))

config = tf.ConfigProto()
config.gpu_options.allow_growth = True  # pylint: disable=no-member
with tf.Session(config=config) as sess:
    loss_ = loss.eval()
    loss_t_ = loss_t.eval()
    print('sigmoid_cross_entropy: {: .3f}\nmanual computation: {: .3f}'.format(
        loss_, loss_t_))
# output:
#   sigmoid_cross_entropy:  0.463
#   manual computation:  0.463

In this case math_ops.add() corresponds to tf.add(x, y), which simply adds two Tensors of the same shape element-wise; the result has the same shape as its arguments.
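You can confirm that the op performs no summation over the batch or classes by checking the shape of its output (a quick sketch reusing the logits and y_true tensors from the example above, TF 1.x API):

# Per-element losses: same shape as the inputs, no reduction inside the op.
per_element = tf.nn.sigmoid_cross_entropy_with_logits(labels=y_true,
                                                      logits=logits)

with tf.Session() as sess:
    print(sess.run(per_element).shape)  # (3, 2), same shape as logits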

When you use sigmoid_cross_entropy_with_logits for a segmentation task, you should do something like this:

loss = tf.nn.sigmoid_cross_entropy_with_logits(labels=labels, logits=predictions)

Where labels is a flattened Tensor of the labels for each pixel, and logits is the flattened Tensor of predictions for each pixel.

It returns loss, a Tensor containing the individual loss for each pixel. Then, you can use

loss_mean = tf.reduce_mean(loss)

to average the losses of all individual pixels to get the final loss.
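Putting both steps together for the segmentation case, here is a minimal sketch (TF 1.x; the tensor names and the 4x4 image shape are just illustrative):

import tensorflow as tf

# Batch of 2 single-channel 4x4 binary masks (random data for illustration).
pixel_logits = tf.random_normal([2, 4, 4, 1])
pixel_labels = tf.cast(tf.random_uniform([2, 4, 4, 1]) > 0.5, tf.float32)

# One loss value per pixel...
per_pixel_loss = tf.nn.sigmoid_cross_entropy_with_logits(labels=pixel_labels,
                                                         logits=pixel_logits)
# ...then averaged into the final scalar loss.
loss_mean = tf.reduce_mean(per_pixel_loss)

with tf.Session() as sess:
    loss_map, final_loss = sess.run([per_pixel_loss, loss_mean])
    print(loss_map.shape)  # (2, 4, 4, 1): the loss keeps the pixel layout
    print(final_loss)      # a single scalar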

