
How can I reuse a "composite" Keras layer?

So, I have this small helper function:

# BatchNormalization, Add and Conv1D are standard keras.layers;
# GLU is a custom gated-linear-unit layer, not built into Keras.
def ResConv(input, size):
    return BatchNormalization()(Add()([
        GLU()(Conv1D(size*2, 5, padding='causal')(input)),
        input
    ]))

It creates a specific sequence of layers to be used together; it's pretty clear.

However, now I realize that I need to reuse the same layers on different inputs; that is, I need to have something like this:

my_res_conv = ResConv(100)
layer_a = my_res_conv(input_a)
layer_b = my_res_conv(input_b)
concat = concatenate([layer_a, layer_b])

and have layer_a and layer_b share weights.
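For context, individual Keras layer instances already behave this way: calling the same instance on several tensors reuses one set of weights. A minimal sketch with a standard Dense layer (assuming `tensorflow.keras`; the custom GLU layer from the snippet above is left out):

```python
from tensorflow.keras import Model
from tensorflow.keras.layers import Dense, Input

# One Dense instance, applied to two different inputs:
shared = Dense(4)
input_a = Input(shape=(8,))
input_b = Input(shape=(8,))
out_a = shared(input_a)  # builds the layer (creates kernel + bias)
out_b = shared(input_b)  # reuses the SAME kernel and bias

model = Model([input_a, input_b], [out_a, out_b])
```

The question is how to get this same single-instance behaviour for a whole block of layers rather than one layer.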

How can I do this? Do I have to write a custom layer? I never did it before, and I'm not sure how to approach this situation.

I ended up making a custom class like this:

class ResConv():
    """Reusable residual-conv block: the weighted sub-layers are
    created once in __init__, so every call shares their weights."""
    def __init__(self, size):
        self.conv = Conv1D(size*2, 5, padding='causal')
        self.batchnorm = BatchNormalization()

    def __call__(self, inputs):
        # Add() and GLU() carry no weights, so recreating them
        # on each call does not affect weight sharing.
        return self.batchnorm(Add()([
            GLU()(self.conv(inputs)),
            inputs
        ]))

Basically, you initialize your layers in `__init__`, and write the whole computation sequence in `__call__`; this way the class reapplies the same layers to new inputs every time you call it.
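The sharing behaviour of this pattern can be illustrated without Keras at all. In this toy sketch (all names hypothetical), a stand-in layer creates its weight lazily on first call, and the wrapper class applies that same sub-layer object on every call:

```python
class FakeConv:
    """Stand-in for a weighted layer: builds its weight on first call."""
    def __init__(self, size):
        self.size = size
        self.weight = None

    def __call__(self, x):
        if self.weight is None:       # built once, like a Keras layer
            self.weight = [0.0] * self.size
        return x                      # pass-through; only ownership matters

class ResBlock:
    def __init__(self, size):
        self.conv = FakeConv(size)    # created once in __init__ ...

    def __call__(self, x):
        return self.conv(x)           # ... reused on every call

block = ResBlock(3)
block("input_a")
w_first = block.conv.weight           # weight created by the first call
block("input_b")                      # second call reuses the same FakeConv
```

Two fresh `ResBlock` instances would each own their own `FakeConv`, so sharing happens exactly when you reuse one instance, which is the behaviour wanted in the question.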
