
How to implement CRelu in Keras?

I'm trying to implement a CRelu layer in Keras.

One option that seems to work is to use a Lambda layer:

import tensorflow as tf
from keras.layers import Conv2D, BatchNormalization, Lambda

def _crelu(x):
    # CRelu concatenates ReLU(x) and ReLU(-x) along the channel axis,
    # so the number of output channels is doubled
    x = tf.nn.crelu(x, axis=-1)
    return x

def _conv_bn_crelu(x, n_filters, kernel_size):
    x = Conv2D(filters=n_filters, kernel_size=kernel_size, strides=(1, 1), padding='same')(x)
    x = BatchNormalization(axis=-1)(x)
    x = Lambda(_crelu)(x)
    return x

But I wonder: does the Lambda layer introduce any overhead during training or inference?

My second attempt is to create a Keras layer that wraps tf.nn.crelu:

import tensorflow as tf
from keras.layers import Layer

class CRelu(Layer):
    def __init__(self, **kwargs):
        super(CRelu, self).__init__(**kwargs)

    def build(self, input_shape):
        super(CRelu, self).build(input_shape)

    def call(self, x):
        x = tf.nn.crelu(x, axis=-1)
        return x

    def compute_output_shape(self, input_shape):
        # CRelu doubles the size of the concatenation axis
        output_shape = list(input_shape)
        output_shape[-1] = output_shape[-1] * 2
        output_shape = tuple(output_shape)
        return output_shape

def _conv_bn_crelu(x, n_filters, kernel_size):
    x = Conv2D(filters=n_filters, kernel_size=kernel_size, strides=(1, 1), padding='same')(x)
    x = BatchNormalization(axis=-1)(x)
    x = CRelu()(x)
    return x

Which version will be more efficient?

I'd also welcome a pure Keras implementation, if that's possible.
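For what it's worth, here is a rough backend-only sketch of what I have in mind (unverified; it assumes that concatenating K.relu(x) and K.relu(-x) reproduces tf.nn.crelu):

from keras import backend as K

def crelu(x, axis=-1):
    # CRelu(x) = concat(ReLU(x), ReLU(-x)) along the given axis
    return K.concatenate([K.relu(x), K.relu(-x)], axis=axis)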

I don't think there is a significant difference between the two implementations speed-wise.

The Lambda implementation is actually the simplest, but writing a custom Layer as you have done is usually better, especially with regard to model saving and loading (the get_config method).

But in this case it doesn't matter, since CReLU is trivial and doesn't require saving and restoring any parameters. You can, however, store the axis parameter, as in the code below. That way it will be restored automatically when the model is loaded.

import tensorflow as tf
from keras.layers import Layer

class CRelu(Layer):
    def __init__(self, axis=-1, **kwargs):
        self.axis = axis
        super(CRelu, self).__init__(**kwargs)

    def build(self, input_shape):
        super(CRelu, self).build(input_shape)

    def call(self, x):
        x = tf.nn.crelu(x, axis=self.axis)
        return x

    def compute_output_shape(self, input_shape):
        # the concatenation axis is doubled in size
        output_shape = list(input_shape)
        output_shape[self.axis] = output_shape[self.axis] * 2
        output_shape = tuple(output_shape)
        return output_shape

    def get_config(self):
        config = {'axis': self.axis}
        base_config = super(CRelu, self).get_config()
        return dict(list(base_config.items()) + list(config.items()))
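
As a quick, hypothetical round trip (the file name and the model variable are placeholders): the stored axis is restored from the config, but the custom class itself still has to be supplied through custom_objects:

from keras.models import load_model

model.save('model_with_crelu.h5')  # assumes `model` was built with CRelu
restored = load_model('model_with_crelu.h5',
                      custom_objects={'CRelu': CRelu})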
