
How to apply weight regularization to the whole model in Keras?

I have a huge model with many convolutional layers, each declared with the W_regularizer=l2(0.01) parameter. I would like to remove it from each layer declaration and apply it to the model as a whole. Is it possible to do this in Keras?

It doesn't look like it from the docs:

Regularizers allow to apply penalties on layer parameters or layer activity during optimization. These penalties are incorporated in the loss function that the network optimizes.

The penalties are applied on a per-layer basis. The exact API will depend on the layer, but the layers Dense, Conv1D, Conv2D and Conv3D have a unified API.
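To see why the per-layer API is not actually limiting, here is a minimal sketch (plain Python, no Keras dependency; the weight values are made up for illustration) of what l2(0.01) contributes per layer, and how one call site can apply the same penalty to every layer instead of repeating W_regularizer=l2(0.01) in each declaration:

```python
def l2_penalty(weights, lam=0.01):
    # What l2(0.01) adds to the loss for one layer: lam * sum of squared weights.
    return lam * sum(w * w for w in weights)

# Hypothetical flattened weight lists for three layers.
layer_weights = [
    [0.5, -0.5],
    [1.0, 2.0],
    [0.1],
]

# One loop applies the penalty model-wide; the total is added to the data loss.
total_penalty = sum(l2_penalty(w) for w in layer_weights)
print(total_penalty)  # ~0.0551
```

In Keras itself the analogous move is to loop over model.layers after construction and assign a shared regularizers.l2(0.01) instance to each layer's kernel_regularizer attribute, rather than writing it into every constructor call; note that attribute assignments like this generally only take effect when the model is rebuilt (e.g. reserialized from its config), since regularization losses are wired up at build time.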

