
Keras custom loss function outputs negative values, don't understand why?

Dear StackOverflow Community, I have the following loss function in Keras:

return K.mean((y_true+K.epsilon()) * K.square(y_pred - y_true), axis=-1)

When I train my network with it (y normalized to 0-1), the loss reaches a negative value, which I just can't understand. I calculated the same thing with NumPy, and everything worked fine and as intended.
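For reference, the same computation in NumPy can be checked like this (a minimal sketch; the shapes and values are made up, and `weighted_mse` is an illustrative name for the loss above):

```python
import numpy as np

# NumPy equivalent of the Keras loss, with a fixed positive epsilon
def weighted_mse(y_true, y_pred, eps=1e-7):
    return np.mean((y_true + eps) * np.square(y_pred - y_true), axis=-1)

y_true = np.random.rand(4, 8)  # values in [0, 1), like the normalized targets
y_pred = np.random.rand(4, 8)
loss = weighted_mse(y_true, y_pred)

# both factors are nonnegative, so the per-sample means cannot be negative
assert (loss >= 0).all()
```

With `y_true >= 0` and a positive epsilon, every term in the mean is a product of two nonnegative factors, so the loss is provably nonnegative.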

I would be really delighted if someone knows the cause of these weird negative values, so thank you for your help.

If y_true is really normalized to 0-1, the only possible cause I can see is K.epsilon(). As this page suggests, epsilon can be changed by the user, and that can cause the problem.

Try hardcoding the epsilon value, or just drop it altogether.
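To see why a changed epsilon could flip the sign: if K.epsilon() were set to a negative value (e.g. via K.set_epsilon), the weight (y_true + epsilon) becomes negative wherever y_true is 0, and the mean can dip below zero. A NumPy sketch of the same arithmetic (the values here are made up to make the effect obvious):

```python
import numpy as np

def weighted_mse(y_true, y_pred, eps):
    return np.mean((y_true + eps) * np.square(y_pred - y_true), axis=-1)

y_true = np.array([[0.0, 0.0, 1.0]])
y_pred = np.array([[1.0, 1.0, 1.0]])

# with a tiny positive epsilon the loss is tiny but nonnegative
print(weighted_mse(y_true, y_pred, eps=1e-7))   # ~6.7e-08

# with a negative epsilon, the zero-valued targets get negative weights,
# so the mean goes negative: ((-0.5)*1 + (-0.5)*1 + 0.5*0) / 3
print(weighted_mse(y_true, y_pred, eps=-0.5))   # ~-0.333
```

Hardcoding a small positive constant (e.g. `1e-7`) inside the loss function makes it immune to a global epsilon change.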
