
tf.layers.dense kernel initializer and regularizer

The tf.layers.dense function is defined as:

tf.layers.dense(
    inputs,
    units,
    activation=None,
    use_bias=True,
    kernel_initializer=None,
    bias_initializer=tf.zeros_initializer(),
    kernel_regularizer=None,
    bias_regularizer=None,
    activity_regularizer=None,
    trainable=True,
    name=None,
    reuse=None
)

has two optional arguments, kernel_initializer and kernel_regularizer. I have two different regularization and initialization techniques of my own that I wish to experiment with, and I am not keen on implementing the entire neural network from scratch. Could someone provide an example of supplying custom functions to these two arguments?

The best thing to do is to check how initializers and regularizers are implemented in TensorFlow. For instance, the variance_scaling_initializer is defined in this code: https://github.com/tensorflow/tensorflow/blob/r1.3/tensorflow/contrib/layers/python/layers/initializers.py#L62-L152

It consists of an initializer function with the following signature:

initializer(shape, dtype=dtype, partition_info=None)

that returns a tensor.
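As a minimal sketch (assuming a TF 1.x graph-mode setup), any callable with that signature that returns a tensor of the requested shape can be passed as kernel_initializer. The function name and the fan-in scaling rule below are made up for illustration; they are not the variance_scaling_initializer itself:

import tensorflow as tf

# Hypothetical custom initializer: any callable taking (shape, dtype, partition_info)
# and returning a tensor of that shape can be used as kernel_initializer.
def my_initializer(shape, dtype=tf.float32, partition_info=None):
    fan_in = shape[0]  # for a dense kernel, the first dimension is the input size
    limit = 1.0 / float(fan_in)
    # Uniform noise scaled by fan-in -- an arbitrary rule chosen for illustration.
    return tf.random_uniform(shape, minval=-limit, maxval=limit, dtype=dtype)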

The regularizers are defined here: https://github.com/tensorflow/tensorflow/blob/r1.3/tensorflow/contrib/layers/python/layers/regularizers.py
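A regularizer is simply a callable that takes the weight tensor and returns a scalar penalty tensor; tf.layers.dense adds that penalty to the tf.GraphKeys.REGULARIZATION_LOSSES collection. Here is a rough sketch of a custom regularizer and of supplying both callables to the layer (the names, shapes, and the 0.01 scale are chosen only for illustration):

# Hypothetical custom regularizer: takes the kernel tensor, returns a scalar penalty.
def my_regularizer(weights):
    # L1-style penalty; the 0.01 scale is arbitrary.
    return 0.01 * tf.reduce_sum(tf.abs(weights))

x = tf.placeholder(tf.float32, shape=[None, 128])
y = tf.layers.dense(
    x,
    units=64,
    kernel_initializer=my_initializer,
    kernel_regularizer=my_regularizer,
)

# The penalty is not added to the training loss automatically; collect it explicitly
# and add it to the task loss when building the objective:
reg_losses = tf.get_collection(tf.GraphKeys.REGULARIZATION_LOSSES)
total_penalty = tf.add_n(reg_losses)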
