
Weighted mse custom loss function in keras

I'm working with time series data, outputting predictions for 60 days ahead.

I'm currently using mean squared error as my loss function, and the results are bad.

I want to implement a weighted mean squared error such that the early outputs are much more important than later ones.

Weighted mean squared error formula:

WMSE = (1/n) * Σ_{i=1..n} (1/i) * (y_true_i - y_pred_i)²

So I need some way to iterate over a tensor's elements with an index (since I need to iterate over both the predicted and the true values at the same time), then write the results to a tensor with only one element. They're both shaped (?, 60), but effectively (1, 60) lists.
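For reference, the target quantity can be computed directly with plain NumPy (a sketch of the formula above, not backend code):

```python
import numpy as np

def weighted_mse_ref(y_true, y_pred):
    """Reference: mean of (1/i) * squared error, with i = 1..n."""
    y_true = np.asarray(y_true, dtype=float)
    y_pred = np.asarray(y_pred, dtype=float)
    idx = np.arange(1, y_true.shape[-1] + 1)  # weights' denominators: [1, 2, ..., n]
    return np.mean((1.0 / idx) * (y_true - y_pred) ** 2)

# early errors count fully, later ones are down-weighted:
# (0 + 1/2 * 1 + 1/3 * 4) / 3
print(weighted_mse_ref([1, 2, 3], [1, 1, 1]))
```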

And nothing I'm trying is working. Here's the code for the broken version:

def weighted_mse(y_true, y_pred):
    wmse = K.cast(0.0, 'float')

    # K.shape(y_true) is a symbolic tensor, so K.eval(size) tries to run the
    # graph with nothing fed in, producing the placeholder error below.
    # The indexing is also wrong: (y_true[i] - y_pred)[i] should be
    # y_true[i] - y_pred[i].
    size = K.shape(y_true)[0]
    for i in range(0, K.eval(size)):
        wmse += 1 / (i + 1) * K.square((y_true[i] - y_pred)[i])

    wmse /= K.eval(size)
    return wmse

I am currently getting this error as a result:

InvalidArgumentError (see above for traceback): You must feed a value for placeholder tensor 'dense_2_target' with dtype float
 [[Node: dense_2_target = Placeholder[dtype=DT_FLOAT, shape=[], _device="/job:localhost/replica:0/task:0/cpu:0"]()]]

Having read the replies to similar posts, I don't think a mask can accomplish the task, and looping over the elements of one tensor would also not work, since I'd not be able to access the corresponding element in the other tensor.

Any suggestions would be appreciated.

You can use this approach:

def weighted_mse(yTrue, yPred):
    ones = K.ones_like(yTrue[0, :])  # a vector of ones shaped (60,)
    idx = K.cumsum(ones)             # [1, 2, ..., 60], similar to range(1, 61)

    return K.mean((1 / idx) * K.square(yTrue - yPred))

Using ones_like with cumsum lets you apply this loss function to any kind of (samples, classes) output, since the weight vector is built from the tensor's own shape.
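A quick NumPy sketch, mirroring the backend ops above, shows why this works for any (samples, classes) shape: the (classes,) weight vector broadcasts across the batch axis.

```python
import numpy as np

# batch of 2 samples, 3 outputs each (stand-in for a (batch, 60) tensor)
y_true = np.array([[1., 2., 3.],
                   [0., 0., 0.]])
y_pred = np.ones_like(y_true)

ones = np.ones_like(y_true[0, :])  # shape (3,), like K.ones_like(yTrue[0, :])
idx = np.cumsum(ones)              # [1., 2., 3.], like K.cumsum(ones)

# the (3,) weight vector broadcasts against every row of the (2, 3) batch
wmse = np.mean((1 / idx) * (y_true - y_pred) ** 2)
print(wmse)
```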


Hint: always use backend functions when working with tensors. You can use slices, but avoid iterating.
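To illustrate the hint in plain NumPy (the same idea carries over to backend tensors): the vectorized expression gives the same result as an explicit Python loop, without ever iterating over tensor elements.

```python
import numpy as np

y_true = np.array([3., 1., 4., 1., 5.])
y_pred = np.array([2., 7., 1., 8., 2.])

# explicit loop: what the broken version tried to do element by element
loop = sum(1 / (i + 1) * (y_true[i] - y_pred[i]) ** 2
           for i in range(len(y_true))) / len(y_true)

# vectorized, loop-free equivalent of the same weighted mean
idx = np.cumsum(np.ones_like(y_true))
vec = np.mean((1 / idx) * (y_true - y_pred) ** 2)

assert np.isclose(loop, vec)
```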
