
L2 normalised output with keras

I would like to build a neural net in Keras with the TensorFlow backend that outputs an L2-normalized vector. I have tried the following, but for some reason it does not normalize the output:

import keras.backend as K
from keras.layers import Input, Dense, Lambda
from keras.models import Model

input = Input(shape=input_shape)
...
dense7 = Dense(3)(flatten6)
l2_norm = Lambda(lambda x: K.l2_normalize(x, axis=1))(dense7)
return Model(inputs=input, outputs=l2_norm)

I found the problem!

I am using TensorFlow as a backend, and K.l2_normalize(x, axis) calls tf.nn.l2_normalize(x, dim, epsilon=1e-12, name=None). Notice that this method has one extra parameter, epsilon, and it is implemented as follows:

with ops.name_scope(name, "l2_normalize", [x]) as name:
    x = ops.convert_to_tensor(x, name="x")
    square_sum = math_ops.reduce_sum(math_ops.square(x), dim, keep_dims=True)
    x_inv_norm = math_ops.rsqrt(math_ops.maximum(square_sum, epsilon))
    return math_ops.mul(x, x_inv_norm, name=name)

So if the output of the net contains values so small that the sum of squares falls below epsilon (which is set to 1e-12 by default), the maximum() clamps the denominator and the vector is not normalized correctly, which is what happens in my case.
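To see this numerically, here is a NumPy sketch of the same formula; the function name l2_normalize_np is made up for illustration and simply mirrors the TensorFlow code above:

```python
import numpy as np

def l2_normalize_np(x, axis=1, epsilon=1e-12):
    # Mirrors tf.nn.l2_normalize: x * rsqrt(max(sum(x**2), epsilon))
    square_sum = np.sum(np.square(x), axis=axis, keepdims=True)
    x_inv_norm = 1.0 / np.sqrt(np.maximum(square_sum, epsilon))
    return x * x_inv_norm

x = np.array([[1e-7, 0.0, 0.0]])  # sum of squares = 1e-14 < epsilon
y = l2_normalize_np(x)
print(np.linalg.norm(y, axis=1))  # [0.1] -- not unit length!
```

Because the squared sum (1e-14) is below epsilon (1e-12), the denominator is clamped to sqrt(1e-12) and the resulting vector has norm 0.1 instead of 1.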

You can call the function that tensorflow.keras.backend.l2_normalize delegates to and set the epsilon value yourself:

from tensorflow.python.ops import nn
nn.l2_normalize(x, axis=None, epsilon=1e-12) 

@thebeancounter You can define your own L2 layer, for example to support masking: if layers that depend on masking follow the L2 normalization, you should use the following:

import tensorflow as tf
import tensorflow.keras.backend as K

class L2Layer(tf.keras.layers.Layer):
    def __init__(self):
        super(L2Layer, self).__init__()
        self.supports_masking = True  # propagate the mask to downstream layers

    def call(self, inputs, mask=None):
        return K.l2_normalize(inputs, axis=2)

I think you can use a last layer like this:

tf.keras.layers.Lambda(lambda x: tf.math.l2_normalize(x, axis=1))
