
How to use mean_squared_error loss in tensorflow session

I am new to tensorflow.

In one part of the code for a tensorflow session, there is:

loss = tf.nn.softmax_cross_entropy_with_logits_v2(
    logits=net, labels=self.out_placeholder, name='cross_entropy')
self.loss = tf.reduce_mean(loss, name='mean_squared_error')

I want to use the mean_squared_error loss function for this purpose instead. I found this loss function on the tensorflow website:

tf.losses.mean_squared_error(
    labels,
    predictions,
    weights=1.0,
    scope=None,
    loss_collection=tf.GraphKeys.LOSSES,
    reduction=Reduction.SUM_BY_NONZERO_WEIGHTS
)

I need this loss function for a regression problem.

I tried:

loss = tf.losses.mean_squared_error(predictions=net, labels=self.out_placeholder)
self.loss = tf.reduce_mean(loss, name='mean_squared_error')

where net = tf.matmul(input_tensor, weights) + biases.

However, I'm not sure if this is the correct way.
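For what it's worth, with the default arguments (weights=1.0, the SUM_BY_NONZERO_WEIGHTS reduction) tf.losses.mean_squared_error already reduces to a scalar equal to the mean of the squared differences, so the extra tf.reduce_mean is a no-op on a scalar. The arithmetic it performs can be checked with plain numpy (the label/prediction values below are made up for illustration):

```python
import numpy as np

# Made-up regression targets and network outputs ("net" in the question).
labels = np.array([1.0, 2.0, 3.0])
predictions = np.array([1.1, 1.9, 3.2])

# With default weights, tf.losses.mean_squared_error computes exactly this:
mse = np.mean((labels - predictions) ** 2)
print(round(mse, 4))  # 0.02
```

Taking tf.reduce_mean of this scalar returns it unchanged, so the tried code should behave correctly, just with one redundant op.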

First of all, keep in mind that cross-entropy is mainly used for classification, while MSE is used for regression.

In your case, cross-entropy measures the difference between two distributions: the real occurrences (called labels) and your predictions.

So while the first loss function works on the result of the softmax layer (which can be seen as a probability distribution), the second works directly on the floating-point output of your network (which is not a probability distribution). Therefore the two cannot simply be exchanged.
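The contrast can be sketched numerically in plain numpy (the logits and one-hot label below are made up; softmax_cross_entropy_with_logits_v2 applies the softmax internally before computing cross-entropy):

```python
import numpy as np

logits = np.array([2.0, 1.0, 0.1])   # raw network output ("net")
labels = np.array([1.0, 0.0, 0.0])   # one-hot class label

# Cross-entropy first turns the logits into a probability distribution
# via softmax, then compares distributions:
probs = np.exp(logits) / np.sum(np.exp(logits))   # sums to 1
cross_entropy = -np.sum(labels * np.log(probs))

# MSE compares the raw floats to the labels directly --
# no probability distribution is involved:
mse = np.mean((labels - logits) ** 2)

print(round(cross_entropy, 3), round(mse, 3))
```

Swapping one loss for the other therefore changes what is being compared: probabilities versus raw outputs. For regression with real-valued targets, MSE on the raw outputs is the appropriate choice.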
