
How to remove/prune the near-zero parameters in a neural network?

I need to remove the near-zero weights of the neural network so that the distribution of parameters is far away from the zero point.

(Figure: the distribution of weights after removing near-zero weights and weight scaling.)

I encountered this problem in this paper: https://ieeexplore.ieee.org/document/7544366

I wonder how I can achieve this in my PyTorch/TensorFlow program, e.g. with a customized activation layer? Or by defining a loss function that penalizes near-zero weights?

Thank you for any help you can provide.
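A minimal sketch of the "loss function that penalizes near-zero weights" idea, assuming a Gaussian-shaped penalty (my own choice, not taken from the paper) that is largest at zero and vanishes for large weights:

```python
import torch

def near_zero_penalty(params, sigma=0.1):
    # Hypothetical regularizer: each weight contributes exp(-w^2 / (2*sigma^2)),
    # which equals 1 at w = 0 and decays toward 0 as |w| grows, so minimizing
    # it pushes weights away from the zero point.
    total = torch.tensor(0.0)
    for p in params:
        total = total + torch.exp(-(p ** 2) / (2 * sigma ** 2)).sum()
    return total

layer = torch.nn.Linear(4, 2)
# Add `lam * near_zero_penalty(model.parameters())` to the training loss.
reg = near_zero_penalty(layer.parameters())
```

Note that the gradient of this penalty is zero exactly at w = 0, so it discourages small-but-nonzero weights rather than moving weights that are already exactly zero.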

You're looking for L1 regularization; read the docs.

import tensorflow as tf

tf.keras.layers.Dense(units=128,
                      kernel_regularizer=tf.keras.regularizers.L1(.1))

Smaller coefficients will be driven toward zero.
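In PyTorch (also mentioned in the question), the same idea can be sketched as an L1 term added to the training loss, followed by a hard threshold that zeroes the remaining near-zero weights so the surviving weights sit away from zero; the `threshold` value here is a hypothetical choice:

```python
import torch
import torch.nn as nn

torch.manual_seed(0)
layer = nn.Linear(128, 64)

# L1 term to add to the training loss (pushes small weights toward zero)
l1_penalty = 0.1 * layer.weight.abs().sum()

# Hard-threshold pruning: zero out weights whose magnitude is below a cutoff
threshold = 0.05  # hypothetical cutoff
with torch.no_grad():
    layer.weight.mul_((layer.weight.abs() >= threshold).float())
```

After the masking step, every remaining nonzero weight has magnitude at least `threshold`.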

