
Regularization layer in Keras has no effect

I tried to write a simple custom regularizer in Keras as follows:

from tensorflow.keras.regularizers import Regularizer
import tensorflow as tf

class MyRegularization(Regularizer):
    def __init__(self):
        self.alpha = 100000000

    def __call__(self, w):
        return self.alpha * tf.reduce_sum(w ** 2)

As you can see, the regularization coefficient is very large. Then I added this regularizer to each layer of a simple network:

from tensorflow.keras import Sequential
from tensorflow.keras.layers import Dense

model = Sequential([
    Dense(1000, activation='relu', input_shape=[10,], kernel_regularizer=MyRegularization()),
    Dense(100, activation='relu', kernel_regularizer=MyRegularization()),
    Dense(10, activation='relu', kernel_regularizer=MyRegularization()),
    Dense(2, activation='softmax', kernel_regularizer=MyRegularization()),
])

I expected that, because of the huge alpha, learning would not converge, but it looks like the value of alpha does not affect the training procedure at all. Why?

I tried alpha = 1e-4, but there is no difference compared with alpha = 1e+10. :-\

Are you calling the class itself,

MyRegularization()

or are you calling an object instance of the class bound to the same name?

MyRegularization = MyRegularization()
...
MyRegularization()

Only the latter actually executes the function body (`__call__`) that runs when an instance is used as a function. That would explain why the regularization value is never returned, making it look like the value of alpha does not affect the training.
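The distinction between calling the class and calling an instance can be shown without TensorFlow; the sketch below uses a plain-Python stand-in for the Keras `Regularizer` base class (the list-based `__call__` body and variable names are illustrative, not the Keras API):

```python
class MyRegularization:
    """Plain-Python stand-in for the Keras Regularizer subclass."""

    def __init__(self):
        # __init__ runs when the *class* is called: MyRegularization()
        self.alpha = 100000000

    def __call__(self, w):
        # __call__ runs only when an *instance* is called: reg(w)
        return self.alpha * sum(x ** 2 for x in w)

reg = MyRegularization()      # calling the class -> constructs an instance
penalty = reg([0.1, 0.2])     # calling the instance -> executes __call__
# penalty is roughly alpha * (0.1**2 + 0.2**2), i.e. about 5,000,000
```

Keras itself expects a `Regularizer` *instance* as the `kernel_regularizer` argument and invokes it internally on the layer's kernel, which is why passing `MyRegularization()` (an instance) is the correct usage in the question's model.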
