
TF/Keras: passing a list as loss for a single output

My model has only one output, but I would like to combine two different loss functions (note: CLASSES = 24):

from tensorflow.keras import backend as K

c = 0.8
lamda = 32

# My custom loss function
def selective_loss(y_true, y_pred):
    loss = K.categorical_crossentropy(
        K.repeat_elements(y_pred[:, -1:], CLASSES, axis=1) * y_true[:, :-1],
        y_pred[:, :-1]) + lamda * K.maximum(-K.mean(y_pred[:, -1]) + c, 0) ** 2
    return loss

p = np.ones(CLASSES) / CLASSES  # the class weights

# Compile the model.
model.compile(loss=['categorical_crossentropy', selective_loss],
              loss_weights=p,
              optimizer=sgd,
              metrics=['accuracy'])

But Keras complains that a list of losses requires one entry per model output:

ValueError: When passing a list as loss, it should have one entry per model outputs. The model has 1 outputs, but you passed loss=['categorical_crossentropy', <function selective_loss at 0x7fcfb68daa60>]

Do I have to combine the two losses into one? If so, how would I do it?

Or is it better to have two outputs? Would that affect prediction, and how would it work?

You can weight the two loss functions inside a single custom loss. I use alpha = 0.5 here, but any other float works too:

from tensorflow.keras.losses import categorical_crossentropy

# Weighted combination of categorical cross-entropy and selective_loss.
def total_loss(y_true, y_pred):
    alpha = 0.5
    return (1 - alpha) * categorical_crossentropy(y_true, y_pred) + alpha * selective_loss(y_true, y_pred)

# Compile the model with the combined loss. loss_weights is no longer
# needed, since the weighting now happens inside total_loss itself.
model.compile(loss=total_loss,
              optimizer=sgd,
              metrics=['accuracy'])
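The convex combination can be sanity-checked without any framework. Below is a minimal NumPy sketch of the same idea; `mse_np` is a hypothetical stand-in for `selective_loss` (which needs the model's extra selection column and is not reproduced here):

```python
import numpy as np

def cce_np(y_true, y_pred, eps=1e-7):
    # Mean categorical cross-entropy over the batch.
    y_pred = np.clip(y_pred, eps, 1.0)
    return -np.mean(np.sum(y_true * np.log(y_pred), axis=1))

def mse_np(y_true, y_pred):
    # Hypothetical second loss, standing in for selective_loss.
    return np.mean((y_true - y_pred) ** 2)

def total_loss_np(y_true, y_pred, alpha=0.5):
    # Same convex combination as total_loss above.
    return (1 - alpha) * cce_np(y_true, y_pred) + alpha * mse_np(y_true, y_pred)

# Two samples, three classes (the original uses CLASSES = 24).
y_true = np.array([[1.0, 0.0, 0.0],
                   [0.0, 1.0, 0.0]])
y_pred = np.array([[0.7, 0.2, 0.1],
                   [0.1, 0.8, 0.1]])

combined = total_loss_np(y_true, y_pred)
```

Because each loss is already reduced to a scalar per batch before being weighted, folding the weights into one function behaves much like a loss list with `loss_weights` would on a multi-output model, while still working with a single output.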
