
What is the use of Batch Normalization layers and Evolving Normalization-Activation layers?

When do we need a batch normalization layer versus an evolving normalization-activation layer, and how do we decide? I am currently using PyTorch, and I want to understand how to decide which layer to add.

The general answer here is to try all of them and select the one that performs best on the validation set.
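A minimal sketch of that "try them all" approach in PyTorch: build the same model with different normalization layers and keep whichever scores best on validation. `make_model`, `train`, and `evaluate` are illustrative names I'm assuming here, not a real API:

```python
import torch.nn as nn

def make_model(norm_layer):
    # The same architecture, parameterized by the normalization layer to test
    return nn.Sequential(
        nn.Conv2d(3, 32, kernel_size=3, padding=1, bias=False),
        norm_layer(32),
        nn.ReLU(),
        nn.AdaptiveAvgPool2d(1),
        nn.Flatten(),
        nn.Linear(32, 10),
    )

# Candidate normalization layers, each taking the channel count
candidates = {
    "batchnorm": nn.BatchNorm2d,
    "groupnorm": lambda c: nn.GroupNorm(8, c),
    "instancenorm": lambda c: nn.InstanceNorm2d(c, affine=True),
}

for name, norm in candidates.items():
    model = make_model(norm)
    # train(model, train_loader)           # hypothetical training loop
    # score = evaluate(model, val_loader)  # keep the best-scoring variant
```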

As for EvoNorm from the Evolving Normalization-Activation Layers paper, it depends on your problem. The authors tested the new layers on classification problems with a limited set of models. For image synthesis, the results weren't as good as for classification.
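For reference, a minimal sketch of the sample-based EvoNorm-S0 variant as I read the paper: y = x · sigmoid(v · x) / group_std(x) · gamma + beta, which replaces both the normalization and the activation in one layer. The parameter names follow the paper's notation; the implementation details below are my assumptions:

```python
import torch
import torch.nn as nn

class EvoNormS0(nn.Module):
    """Sketch of EvoNorm-S0: x * sigmoid(v * x) / group_std(x) * gamma + beta."""

    def __init__(self, num_channels, groups=32, eps=1e-5):
        super().__init__()
        assert num_channels % groups == 0, "groups must divide num_channels"
        self.groups = groups
        self.eps = eps
        self.gamma = nn.Parameter(torch.ones(1, num_channels, 1, 1))
        self.beta = nn.Parameter(torch.zeros(1, num_channels, 1, 1))
        self.v = nn.Parameter(torch.ones(1, num_channels, 1, 1))

    def group_std(self, x):
        # Standard deviation over each group of channels and the spatial dims
        n, c, h, w = x.shape
        x = x.view(n, self.groups, c // self.groups, h, w)
        var = x.var(dim=(2, 3, 4), keepdim=True, unbiased=False)
        std = torch.sqrt(var + self.eps)
        return std.expand_as(x).reshape(n, c, h, w)

    def forward(self, x):
        return x * torch.sigmoid(self.v * x) / self.group_std(x) * self.gamma + self.beta
```

Because the statistics are per-sample, this variant behaves the same at train and test time, unlike batchnorm.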

In my opinion, batchnorm is a good starting point for constructing a baseline solution, because it is time-tested; then try more advanced things.
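A typical baseline building block in that spirit, using the standard Conv -> BatchNorm -> ReLU ordering (an illustrative sketch, not from the answer above):

```python
import torch.nn as nn

def conv_bn_relu(in_ch, out_ch):
    return nn.Sequential(
        # bias=False: BatchNorm's learned shift makes the conv bias redundant
        nn.Conv2d(in_ch, out_ch, kernel_size=3, padding=1, bias=False),
        nn.BatchNorm2d(out_ch),
        nn.ReLU(inplace=True),
    )
```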
