
Add L2 Regularization to Tensorflow contrib.learn.Estimator

I want to add L2 regularization to a custom contrib.learn estimator, but I can't figure out an easy way to do it.

Is there a way to add L2 regularization to the existing Estimators (e.g. the DNNClassifier) that I have overlooked?

The only way I can think of to add the L2 norm to my custom estimator is to write a new head with an altered cost function. But I assume there is an easier and more elegant solution to this common problem. Has anybody had the same issue?

EDIT: I think I found a solution. I can use gradient_clip_norm to clip the gradients. That way the gradients are limited by their global L2 norm, so essentially I have L2 regularization. Is my thinking correct?
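For reference, this is roughly how I would pass it to one of the canned estimators (the feature column, layer sizes, and constants here are just placeholders):

```python
import tensorflow as tf

# Illustrative feature column; replace with your real features.
feature_columns = [tf.contrib.layers.real_valued_column("x", dimension=10)]

# gradient_clip_norm caps the global L2 norm of the gradients at each step.
classifier = tf.contrib.learn.DNNClassifier(
    hidden_units=[64, 32],
    feature_columns=feature_columns,
    n_classes=3,
    gradient_clip_norm=5.0)
```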

AFAIK, you cannot add L1 or L2 regularization to the provided estimators, such as the subclasses of tf.estimator.Estimator and tf.contrib.learn.Estimator. However, you can create a custom estimator using the tf.layers API, as explained here: https://www.tensorflow.org/extend/estimators. With a custom estimator you can apply regularizers. Please refer to this answer: https://stackoverflow.com/a/44238354/4206988 for regularizing weights with the tf.layers API. Functions like tf.layers.dense() have a kernel_regularizer argument with which you can regularize the weight matrix.
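For concreteness, here is a minimal sketch of such a custom model_fn; the layer sizes, the "x" feature key, N_CLASSES, and the 0.01 regularization scale are illustrative assumptions, not anything prescribed by the API:

```python
import tensorflow as tf

N_CLASSES = 3  # hypothetical number of output classes

def model_fn(features, labels, mode):
    # L2 penalty with scale 0.01 (a hyperparameter to tune).
    regularizer = tf.contrib.layers.l2_regularizer(scale=0.01)

    net = tf.layers.dense(features["x"], 64, activation=tf.nn.relu,
                          kernel_regularizer=regularizer)
    logits = tf.layers.dense(net, N_CLASSES, kernel_regularizer=regularizer)

    predictions = tf.argmax(logits, axis=1)
    if mode == tf.estimator.ModeKeys.PREDICT:
        return tf.estimator.EstimatorSpec(mode, predictions=predictions)

    # Cross-entropy loss plus the collected regularization terms.
    loss = tf.losses.sparse_softmax_cross_entropy(labels=labels, logits=logits)
    loss += tf.losses.get_regularization_loss()

    optimizer = tf.train.AdamOptimizer(learning_rate=1e-3)
    train_op = optimizer.minimize(loss, global_step=tf.train.get_global_step())
    return tf.estimator.EstimatorSpec(mode, loss=loss, train_op=train_op)

estimator = tf.estimator.Estimator(model_fn=model_fn)
```

tf.layers.dense adds the penalty terms created by kernel_regularizer to the tf.GraphKeys.REGULARIZATION_LOSSES collection, which is exactly what tf.losses.get_regularization_loss() sums up, so adding that one line to the loss is all the wiring you need.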

And please note that L2 regularization is not the same thing as gradient norm clipping. Norm clipping imposes a hard limit on the norm of the gradients, while L2 regularization imposes no such limit; it is a soft constraint that penalizes large weights through an extra term in the loss.
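A toy sketch to make the difference concrete (the variable, the loss, and the 1.0/0.01 constants are made up for illustration):

```python
import tensorflow as tf

w = tf.Variable([3.0, 4.0])
loss = tf.reduce_sum(tf.square(w - 1.0))
optimizer = tf.train.GradientDescentOptimizer(0.1)

# Gradient norm clipping: a hard cap on the global L2 norm of the gradients.
grads_and_vars = optimizer.compute_gradients(loss)
grads, variables = zip(*grads_and_vars)
clipped, _ = tf.clip_by_global_norm(grads, clip_norm=1.0)
train_clipped = optimizer.apply_gradients(list(zip(clipped, variables)))

# L2 regularization: a soft penalty added to the loss; gradients stay unbounded.
l2_loss = loss + 0.01 * tf.nn.l2_loss(w)
train_l2 = optimizer.minimize(l2_loss)
```

Clipping only changes the update step when the gradient norm exceeds the cap; the L2 penalty changes the objective itself and keeps pulling the weights toward zero on every step.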
