
Rewrite tf.contrib.layers.batch_norm in Tensorflow 2.0

Could somebody help me rewrite the following block of code in TF 2.0?
I'm aware that batch_norm is equivalent to keras.layers.BatchNormalization, but the documentation doesn't give a clear answer as to what 'decay' and 'epsilon' correspond to. Thanks!

def batch_norm(opts, _input, is_train, reuse, scope, scale=True):
    """Batch normalization based on tf.contrib.layers.

    """
    return tf.contrib.layers.batch_norm(
        _input, center=True, scale=scale,
        epsilon=opts['batch_norm_eps'], decay=opts['batch_norm_decay'],
        is_training=is_train, reuse=True, updates_collections=None,
        scope=scope, fused=False)

In this case, decay corresponds to momentum of tf.keras.layers.BatchNormalization, and epsilon is still epsilon.
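Based on that parameter mapping, a minimal TF 2.0 sketch of the function might look like the following. Note this is an assumption about intent, not a drop-in replacement: in Keras the layer object owns its variables, so the `reuse` and `scope` arguments have no direct equivalent; to share statistics you would create the `BatchNormalization` layer once and call the same instance on multiple inputs. `is_train` becomes the `training` argument of the layer call.

```python
import tensorflow as tf

def batch_norm(opts, _input, is_train, scale=True):
    """TF 2.0 sketch of the tf.contrib.layers.batch_norm wrapper.

    Mapping from the old arguments:
      decay       -> momentum
      epsilon     -> epsilon
      is_training -> the `training` argument of the layer call
      reuse/scope -> no equivalent; reuse the layer *instance* instead
    """
    bn = tf.keras.layers.BatchNormalization(
        center=True,
        scale=scale,
        epsilon=opts['batch_norm_eps'],
        momentum=opts['batch_norm_decay'])
    return bn(_input, training=is_train)
```

If the original code relied on `reuse=True` to share one normalization across calls, hoist the layer construction out of the function and pass the layer in, calling it with different inputs as needed.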
