
How to implement a custom Keras regularizer with the output of another model as a parameter?

I'm trying to replicate this paper: https://arxiv.org/pdf/1705.08302.pdf

Basically, a fully convolutional network (FCN) makes voxel-level predictions on a patch of an image; then this patch and its respective labels are both passed through an autoencoder and compared, to evaluate the "global shape" of the predictions.

So the loss function (eq. (1), page 4) is a linear combination of the cross-entropy loss from the FCN and the Euclidean distance loss from the autoencoder.
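In the notation of the code below (a sketch only; the paper's exact symbols and its third term may differ), the two terms that are implemented combine as:

```latex
\mathcal{L} = L_x(y, \hat{y}) + \lambda_1 \, L_{he}\big(AE(y),\, AE(\hat{y})\big)
```

where $L_x$ is the voxel-wise cross entropy, $L_{he}$ is the distance between the autoencoder codes of the ground truth and the prediction, and $\lambda_1$ is the weighting factor (the `l1` variable in the code).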

Problem:

I have a working FCN and a working autoencoder; my problem has been implementing this loss function in Keras/TensorFlow. So, how can I do that?

This is what I have tried so far (without the third term of the equation), but it gives wrong results:

from keras import backend as K

def euclidean_distance_loss(y_true, y_pred):
    # Euclidean (L2) distance between prediction and target
    return K.sqrt(K.sum(K.square(y_pred - y_true)))

def ACNN_loss(l1, autoencoder):
    def loss(y_true, y_pred):
        # Encode both the prediction and the ground truth with the
        # pre-trained autoencoder
        ae_seg = autoencoder(y_pred)
        ae_gt = autoencoder(y_true)

        # Shape regularization term
        Lhe = K.sqrt(K.sum(K.square(ae_seg - ae_gt)))

        # Voxel-wise cross-entropy term
        Lx = K.binary_crossentropy(y_true, y_pred)

        return Lx + (l1 * Lhe)

    return loss

l1 = 0.01

ae_path = ...  # path of my autoencoder model and its weights
autoencoder = keras.models.load_model(os.path.join(ae_path, 'model.h5'),
                                      custom_objects={'euclidean_distance_loss': euclidean_distance_loss})
autoencoder.load_weights(os.path.join(ae_path, 'weigths.h5'))

model.compile(loss=ACNN_loss(l1, autoencoder),
              optimizer=keras.optimizers.Adam(lr=0.0003, beta_1=0.9, beta_2=0.999,
                                              epsilon=1e-08, decay=0.0),
              metrics=['accuracy', keras.metrics.binary_crossentropy])

This is my first question, so sorry if I missed any requirements. Thanks in advance.

The square root is unnecessary. Looking at the paper you attached, the loss doesn't contain a sqrt function; i.e., the regularization term takes the squared distance, not the norm distance.

Specifically, you should replace

Lhe = K.sqrt(K.sum(K.square(ae_seg - ae_gt)))

with

Lhe = K.sum(K.square(ae_seg - ae_gt))

In general, L2 regularization takes the squared distance.
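To make the fix concrete outside of a Keras graph, here is a minimal NumPy sketch of the combined loss with the corrected (sqrt-free) regularization term. The function names and inputs are illustrative, not from the paper; `ae_gt` and `ae_seg` stand for the autoencoder codes of the ground truth and the predicted segmentation:

```python
import numpy as np

def shape_regularizer(ae_seg, ae_gt):
    # Squared Euclidean distance between the two latent codes --
    # note: no square root, as suggested above.
    return np.sum(np.square(ae_seg - ae_gt))

def binary_crossentropy(y_true, y_pred, eps=1e-7):
    # Plain NumPy stand-in for Keras' binary cross entropy,
    # averaged over all voxels.
    y_pred = np.clip(y_pred, eps, 1.0 - eps)
    return -np.mean(y_true * np.log(y_pred) + (1.0 - y_true) * np.log(1.0 - y_pred))

def acnn_loss(y_true, y_pred, ae_gt, ae_seg, l1=0.01):
    # Linear combination of the two terms, mirroring ACNN_loss above.
    return binary_crossentropy(y_true, y_pred) + l1 * shape_regularizer(ae_seg, ae_gt)
```

Inside the actual Keras loss this corresponds to `Lx + l1 * K.sum(K.square(ae_seg - ae_gt))`.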
