Batch Normalization Quantize Tensorflow 1.x does not have MinMax information
A layer (....) which is an input to the Conv operator producing the output array model/re_lu_1/Relu, is lacking min/max data, which is necessary for quantization. If accuracy matters, either target a non-quantized output format, or run quantized training with your model from a floating point checkpoint to change the input graph to contain min/max information. If you don't care about accuracy, you can pass --default_ranges_min= and --default_ranges_max= for easy experimentation.
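To see why the converter needs min/max data, here is a minimal pure-Python sketch (not TFLite's actual implementation) of min/max-based affine quantization. Without a recorded (min, max) range for a tensor, the scale and zero point cannot be derived; `--default_ranges_min`/`--default_ranges_max` simply substitute a fixed range everywhere one is missing:

```python
def quantize_affine(values, range_min, range_max, num_bits=8):
    """Map floats in [range_min, range_max] to unsigned integers.

    Sketch of min/max-based affine quantization: the (scale, zero_point)
    pair is derived entirely from the recorded range, which is why a
    tensor with no MinMax information cannot be quantized.
    """
    levels = 2 ** num_bits - 1                  # 255 for 8 bits
    scale = (range_max - range_min) / levels
    zero_point = round(-range_min / scale)
    quantized = [
        max(0, min(levels, round(v / scale) + zero_point)) for v in values
    ]
    return quantized, scale, zero_point


def dequantize_affine(quantized, scale, zero_point):
    """Recover approximate floats from the quantized integers."""
    return [(q - zero_point) * scale for q in quantized]


# A ReLU output observed in the range [0.0, 6.0] (as with Relu6):
q, scale, zp = quantize_affine([0.0, 1.5, 3.0, 6.0], 0.0, 6.0)
approx = dequantize_affine(q, scale, zp)
```

The round-trip error is bounded by the scale, so a range that is too wide (such as a careless default range) directly costs accuracy.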
For tensorflow 1.x, if you want to quantize, you have to place fake quantization nodes in the graph to activate quantization of the model. Quantization itself proceeds in three phases.
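What a fake quantization node does can be seen with the underlying op, `tf.quantization.fake_quant_with_min_max_args` (available in both TF 1.x and 2.x; the snippet below assumes eager execution, so under TF 1.x you would run it in a session instead). It simulates the quantize/dequantize round trip in float while recording the min/max range the converter later needs:

```python
import tensorflow as tf

# Activations as they would appear after a Relu6: known range [0, 6].
x = tf.constant([0.0, 1.5, 3.0, 6.0])

# Fake-quantize: snap each value to one of 255 levels in [0, 6],
# then map it back to float. The output stays float, but now carries
# the quantization error and an explicit min/max range.
y = tf.quantization.fake_quant_with_min_max_args(x, min=0.0, max=6.0)
```

Values already on a quantization level (0.0 and 6.0 here) pass through unchanged; the rest move by at most one quantization step, 6/255.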
However, the most important factor is the configuration of batch_normalization in the model. After trying multiple configurations, the best one is batch_normalization from tensorflow.keras.layers without the fused option. The reason is that Tensorflow wants to avoid the folding result being quantized; therefore, an activation placed behind batchnorm won't work. Details in [here][1].
In short, this layer should be attached only under tensorflow.keras.layers.Conv2D with a parsed activation param, which is Relu/Relu6/Identity.
If you conduct the above process, Conv2d => Activation => BatchNorm, the layer will not yield the error does not have MinMax information.
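A minimal sketch of that layout, assuming tf.keras (the function and shapes here are illustrative, not from the original answer): the activation is parsed as a Conv2D param, and BatchNormalization follows unfused so batch-norm folding does not hide the activation's range:

```python
import tensorflow as tf

def build_quantizable_block(inputs, filters):
    """Conv2d => Activation => BatchNorm, as described above.

    The activation is a parameter of Conv2D (relu here; relu6 or
    linear/identity also qualify), and BatchNormalization is created
    with fused=False per the answer's recommendation.
    """
    x = tf.keras.layers.Conv2D(
        filters, 3, padding="same", activation="relu")(inputs)
    x = tf.keras.layers.BatchNormalization(fused=False)(x)
    return x

inputs = tf.keras.Input(shape=(32, 32, 3))
outputs = build_quantizable_block(inputs, filters=16)
model = tf.keras.Model(inputs, outputs)
```

The key design point is that the nonlinearity is inside the Conv2D layer rather than a separate Activation layer after batchnorm, so the converter sees a bounded, observable range for the Conv output.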