
Handling and Combining two loss functions in Keras TF

Is there a way to have two loss functions in Keras in which the second loss function takes the output from the first loss function?

I am working on a neural network with Keras, and I want to add another custom function to the loss term inside `model.compile()` to regularize the model and somehow penalize it. It currently has this form:

model.compile(loss_1='mean_squared_error', optimizer=Adam(lr=learning_rate), metrics=['mae'])

I would like to add another loss function, computed as the sum of the predicted values, so that I can tell the neural network to minimize that sum alongside the original loss. How can I do that (loss_2)?

Something like:

model.compile(loss_1='mean_squared_error', loss_2= np.sum(****PREDICTED_OUTPUT_FROM_LOSS_FUNCTION_1****), optimizer=Adam(lr=learning_rate), metrics=['mae'])

How can this be implemented?

You should define a custom loss function:

import tensorflow as tf

def custom_loss_function(y_true, y_pred):
    # Combine two error terms into a single scalar loss:
    # mean squared error plus mean absolute error.
    squared_difference = tf.square(y_true - y_pred)
    absolute_difference = tf.abs(y_true - y_pred)

    loss = tf.reduce_mean(squared_difference, axis=-1) + \
           tf.reduce_mean(absolute_difference, axis=-1)
    return loss

model.compile(optimizer='adam', loss=custom_loss_function)

I believe that would solve your problem.
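For the specific penalty the question describes, a sketch of one way to do it: fold the sum of the predicted values directly into the custom loss as a weighted term (the `PENALTY_WEIGHT` constant here is hypothetical; you would tune it for your problem):

```python
import tensorflow as tf

PENALTY_WEIGHT = 0.01  # hypothetical weight for the loss_2 term; tune as needed

def mse_plus_prediction_sum(y_true, y_pred):
    # Standard mean squared error (the original loss_1)
    mse = tf.reduce_mean(tf.square(y_true - y_pred), axis=-1)
    # Penalize the sum of the predicted values (the requested loss_2)
    prediction_sum = tf.reduce_sum(y_pred, axis=-1)
    return mse + PENALTY_WEIGHT * prediction_sum

# Minimal example model; replace with your own architecture
model = tf.keras.Sequential([tf.keras.layers.Dense(1, input_shape=(4,))])
model.compile(optimizer='adam', loss=mse_plus_prediction_sum, metrics=['mae'])
```

Because Keras only requires `loss` to be a callable of `(y_true, y_pred)`, any differentiable combination of terms can be packed into one function this way, rather than passing two separate loss arguments to `compile()`.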
