Using Keras layers inside custom loss function
Is it possible to use a Keras layer (pre-trained or fixed layer with no trainable parameters) inside a custom loss function?
I would like to do something like:
def custom_loss(y_true, y_pred):
    y_true_trans = SomeKerasLayer()(y_true)
    y_pred_trans = SomeKerasLayer()(y_pred)
    return K.mean(K.abs(y_pred_trans - y_true_trans), axis=-1)
With the TensorFlow backend, I get the error:

File "/home/drb/venvs/keras/lib/python3.5/site-packages/tensorflow/python/framework/tensor_util.py", line 364, in make_tensor_proto
    raise ValueError("None values not supported.")
ValueError: None values not supported.
Of course I could transform y_pred with the Keras layer outside the loss function (by providing an extra output), but I can't do the same with the reference value y_true.
Another way to rephrase the same question in more general terms would be: Is it possible to encapsulate a Keras layer as a Keras backend function?
Is there any solution or workaround?
The question is somewhat vague, so the answer is both yes and no. Depending on your implementation, you may try:
model = keras.layers.Add(..something..)(x)
where x = the name of the previous relevant value.
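More directly to the original question: in tf.keras a layer can be called on tensors inside a loss function like any other operation, as long as the layer's weights already exist (i.e. the layer has been built) before the loss is first traced. The sketch below assumes the fixed transformation is a frozen Dense layer; `transform` and the layer sizes are placeholders I chose for illustration, not anything from the original post.

```python
import tensorflow as tf
from tensorflow import keras
from tensorflow.keras import backend as K

# Placeholder for the fixed/pre-trained transformation; any layer with
# trainable=False behaves the same way inside the loss.
transform = keras.layers.Dense(4, use_bias=False, trainable=False)
transform.build((None, 4))  # create the weights before the first call

def custom_loss(y_true, y_pred):
    # Apply the same fixed layer to both tensors, then compare them.
    y_true_trans = transform(y_true)
    y_pred_trans = transform(y_pred)
    return K.mean(K.abs(y_pred_trans - y_true_trans), axis=-1)

model = keras.Sequential([keras.layers.Dense(4, input_shape=(4,))])
model.compile(optimizer="adam", loss=custom_loss)
```

Building the layer with an explicit input shape is the key step: in older graph-mode Keras, y_true can arrive with an undefined shape, and letting the layer infer its shape from it is one plausible source of the "None values not supported" error. Because `transform` is frozen, gradients still flow through y_pred without updating the transformation's weights.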