I need a custom Keras loss function. The following code works with loss function loss2 but not with loss function loss1. I am getting this error:

OperatorNotAllowedInGraphError: using a tf.Tensor as a Python bool is not allowed in Graph execution. Use Eager execution or decorate this function with @tf.function.

I have decorated the function with @tf.function, but it does not work.
```python
import numpy as np
from keras import backend as K
from keras.models import Sequential
from keras.layers import Dense, LeakyReLU

def individual_model(keys, labels):
    model = Sequential()
    size = 32
    model.add(Dense(32, input_dim=1))
    model.add(LeakyReLU())
    for i in range(2):
        model.add(Dense(size))
        model.add(LeakyReLU())
    model.add(Dense(1))
    model.compile(optimizer='adam', loss=my_loss, metrics=[my_loss])
    model.fit(keys, labels, epochs=256, batch_size=32, verbose=1)
    return model

def loss1(v):
    if v < 0:
        return -100 * v
    else:
        return v

def loss2(v):
    return v * v

def my_loss(y_true, y_pred):
    return K.map_fn(loss1, y_true - y_pred)  # with loss2 this works

x = np.random.exponential(100, 1000)
x.sort()
labels = np.arange(0, 1000)
m = individual_model(x, labels)
```
tf.where could be used in the loss1 definition, something like this:

```python
import tensorflow as tf
import numpy as np

arr = np.array([1., -1.])
sess = tf.Session()  # TF 1.x style; in TF 2.x just call tf.where eagerly
sess.run(tf.where(arr < 0., arr * 100., arr * 10.))
```

Output:

```
array([  10., -100.])
```
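Applied to the question's loss, a minimal sketch of my_loss rewritten with tf.where (assuming TF 2.x eager execution; the -100 factor comes from loss1 in the question):

```python
import tensorflow as tf

def my_loss(y_true, y_pred):
    # Element-wise piecewise loss mirroring loss1 (-100*v for v < 0, v otherwise),
    # but without Python control flow on symbolic tensors.
    v = y_true - y_pred
    return tf.where(v < 0.0, -100.0 * v, v)

# Quick eager-mode check: v = [-1., 1.] -> [100., 1.]
out = my_loss(tf.constant([1.0, 2.0]), tf.constant([2.0, 1.0]))
print(out.numpy())
```

Because tf.where selects element-wise between two already-computed branches, it traces into a graph without ever converting a tensor to a Python bool.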
Based on @Engineero's recommendation, the following code works:

```python
def cond_switch(x):
    return K.relu(-x) * 10 + K.relu(x)

def my_loss(y_true, y_pred):
    return cond_switch(y_true - y_pred)
```
I am still trying to figure out the reason.
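To see why the relu trick behaves piecewise, here is a minimal NumPy check (np.maximum(x, 0) plays the role of K.relu; the factor 10 is taken from cond_switch above):

```python
import numpy as np

def cond_switch_np(x):
    # relu(-x)*10 + relu(x): for x < 0 only the first term is nonzero (-10*x);
    # for x >= 0 only the second term is nonzero (x). Exactly one branch
    # contributes per element, so no Python bool on a tensor is ever needed.
    return np.maximum(-x, 0) * 10 + np.maximum(x, 0)

v = np.array([-2.0, -0.5, 0.0, 3.0])
print(cond_switch_np(v))  # [20.  5.  0.  3.]
```

Note the negative branch here scales by 10, whereas loss1 in the question scales by 100; adjust the constant if you want the two to match exactly.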