
Custom Loss Function in TensorFlow for weighting training data

I want to weight the training data based on a column in the training data set, thereby giving more importance to certain training items than to others. The weighting column should not be included as a feature for the input layer.

The TensorFlow documentation contains an example of how to use the label of an item to assign a custom loss and thereby a weight:

# Ensures that the loss for examples whose ground truth class is `3` is 5x
# higher than the loss for all other examples.
weight = tf.multiply(4, tf.cast(tf.equal(labels, 3), tf.float32)) + 1

onehot_labels = tf.one_hot(labels, num_classes=5)
tf.contrib.losses.softmax_cross_entropy(logits, onehot_labels, weight=weight)
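
A note on the API: tf.contrib.losses has since been deprecated in favour of tf.losses. With the same tensors as above, the equivalent non-contrib call takes the one-hot labels first and names the argument weights:

tf.losses.softmax_cross_entropy(onehot_labels, logits, weights=weight)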

I am using this in a custom DNN with three hidden layers. In theory I simply need to replace labels in the example above with a tensor containing the weight column.

I am aware that there are several threads that already discuss similar problems, e.g. "defined loss function in tensorflow?".

For some reason I am running into a lot of problems trying to bring my weight column in. It is probably just two easy lines of code, or maybe there is an easier way to achieve the same result.

I believe I found the answer:

  # Take the per-example weight from the last column of the features tensor
  # (features is assumed to have shape [batch_size, n_columns]).
  weight_tf = features[:, -1]
  loss = tf.losses.softmax_cross_entropy(target, logits, weights=weight_tf)

The weight is the last column of the features tensor.
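
Putting it together, here is a minimal sketch of how the weight column can stay out of the input layer while still driving the loss. It assumes features is a 2-D float tensor whose last column holds the per-example weight; the placeholder shapes, layer sizes and num_classes below are made up for illustration:

import tensorflow as tf

num_classes = 5
n_input_features = 10  # columns actually fed to the network

# The last column of `features` is the per-example weight, not a model input.
features = tf.placeholder(tf.float32, [None, n_input_features + 1])
onehot_labels = tf.placeholder(tf.float32, [None, num_classes])

weight_tf = features[:, -1]     # shape [batch_size], used only in the loss
net_inputs = features[:, :-1]   # the actual inputs to the DNN

# Three hidden layers, as in the question (sizes are placeholders).
hidden = net_inputs
for units in (64, 32, 16):
    hidden = tf.layers.dense(hidden, units, activation=tf.nn.relu)
logits = tf.layers.dense(hidden, num_classes)

# Each example's cross-entropy term is scaled by its weight before the
# weighted mean over the batch is taken.
loss = tf.losses.softmax_cross_entropy(onehot_labels, logits, weights=weight_tf)

Because the weight column is sliced off before the first dense layer, it only influences how strongly each example counts in the loss and never acts as an input feature.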
