This post almost does what I want. In a nutshell, the suggested solution is:
from keras import backend as K

def custom_loss(y_true, y_pred):
    # Your model exists in global scope
    global e
    # Get the layers of your model
    layers = [l for l in e.layers]
    # Construct a graph to evaluate your other model on y_pred
    eval_pred = y_pred
    for i in range(len(layers)):
        eval_pred = layers[i](eval_pred)
    # Construct a graph to evaluate your other model on y_true
    eval_true = y_true
    for i in range(len(layers)):
        eval_true = layers[i](eval_true)
    # Now do what you wanted to do with the outputs.
    # Note that we are not returning values, but a tensor.
    return K.mean(K.square(eval_pred - eval_true), axis=-1)
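For intuition, the last line computes a per-sample mean squared error between the two transformed tensors. A framework-free NumPy sketch of that reduction (the function name is illustrative, not Keras API):

```python
import numpy as np

def mse_last_axis(eval_true, eval_pred):
    # Mirrors K.mean(K.square(eval_pred - eval_true), axis=-1):
    # squared difference, averaged over the last axis,
    # leaving one value per sample in the batch.
    return np.mean(np.square(eval_pred - eval_true), axis=-1)

batch_true = np.array([[1.0, 2.0], [3.0, 4.0]])
batch_pred = np.array([[1.0, 2.0], [3.0, 6.0]])
print(mse_last_axis(batch_true, batch_pred))  # → [0. 2.]
```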
In the function above, e is a global variable: the model itself. The custom loss function uses this global model without requiring the caller to pass it in. I'm not a big fan of globals. Is there a way to construct a custom loss function that takes the model object itself without using a global variable? For example, can I define a function custom_loss(y_true, y_pred, e), delete the global e line, and still pass my custom_loss as the loss function of a model?
The Keras API does not support that. As the documentation states, a loss function takes exactly two arguments: y_true and y_pred.
If you want such a feature, you have to modify Keras itself. Take a look at:

the compile function in keras/engine/training.py
the weighted_masked_objective function in keras/engine/training_utils.py