
Retrained Tflite/Pb models extracted from the same checkpoint give different results

After I retrained a pre-trained SSD MobileNet v1 model on my own image dataset using the object_detection\model_main.py script, I exported both a frozen .pb graph (with the export_inference_graph.py script):

python models\research\object_detection\export_inference_graph.py 
--input_type image_tensor 
--input_shape=1,300,300,3 
--pipeline_config_path ssd_mobilenet_v1_test.config 
--trained_checkpoint_prefix training/model.ckpt 
--output_directory export\freeze\

and a .tflite graph (with the export_tflite_ssd_graph.py script and tflite_convert):

python models\research\object_detection\export_tflite_ssd_graph.py 
--input_type image_tensor 
--pipeline_config_path ssd_mobilenet_v1_test.config 
--trained_checkpoint_prefix training/model.ckpt 
--output_directory export\tflite\ 
--max_detections 16 
--add_postprocessing_op=true

tflite_convert 
--output_file=export\tflite\model.tflite 
--graph_def_file=export\tflite\tflite_graph.pb 
--input_shapes=1,300,300,3 
--input_arrays=normalized_input_image_tensor 
--output_arrays=TFLite_Detection_PostProcess,TFLite_Detection_PostProcess:1,TFLite_Detection_PostProcess:2,TFLite_Detection_PostProcess:3 
--inference_type=QUANTIZED_UINT8 
--mean_values=128 
--std_dev_values=128 
--default_ranges_min=0 
--default_ranges_max=6 
--allow_custom_ops
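One thing worth double-checking in the command above is the input normalization implied by `--mean_values=128` and `--std_dev_values=128`: with `inference_type=QUANTIZED_UINT8`, the converter assumes the network was trained on inputs scaled as `(pixel - mean) / std`. If that does not match the preprocessing the checkpoint was actually trained with, the quantized model can produce garbage outputs. A minimal sketch of what those two flags mean (plain numpy, no TensorFlow required):

```python
import numpy as np

# With --mean_values=128 and --std_dev_values=128 the converter maps
# uint8 pixels in [0, 255] to real values as (pixel - 128) / 128,
# i.e. roughly the [-1, 1) range that SSD MobileNet expects.
mean, std = 128.0, 128.0

pixels = np.array([0, 128, 255], dtype=np.uint8)
real = (pixels.astype(np.float32) - mean) / std

print(real)  # [-1.  0.  0.9921875]
```

If the training pipeline normalized images differently (for example to [0, 1]), these flag values would have to change accordingly.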

The .pb graph seems to work just fine, but the tflite one falsely detects everything on Android: I get 16 out of 16 possible detections whatever image I pass to it, even an image filled with black. (I test it on an Android device; it works well with the pre-trained model.)

Changing convert options, such as disabling/enabling quantization or the image std/mean, didn't change anything. I also compared my tflite graph to the example MobileNet graph and they look pretty similar. Any ideas what could cause this problem?

(Windows 10 / CUDA 9.0 / cuDNN 7.0 / tf-nightly-gpu / models-master)

The output tensors from the tflite model appear to return some extreme values (e.g. 5e35 or -3e34). Since some of these score values are greater than 1, they count as detections.

My solution: replace all values greater than a limit (I used 1e5) with 0. (Doing this in Python was faster.)

tensor[tensor > 1e5] = 0
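The one-liner above assumes a numpy array. A slightly fuller sketch of the workaround, using a hypothetical score array standing in for the interpreter's `TFLite_Detection_PostProcess:2` (scores) output:

```python
import numpy as np

# Hypothetical scores tensor standing in for the TFLite_Detection_PostProcess:2
# output; real models return shape [1, max_detections].
scores = np.array([[0.9, 5e35, 0.3, -3e34, 0.7]], dtype=np.float32)

# Zero out the extreme garbage values so they no longer count as detections.
scores[scores > 1e5] = 0

# A typical score threshold then keeps only the plausible detections.
keep = scores > 0.5
print(int(keep.sum()))  # 2 detections survive (0.9 and 0.7)
```

This masks the symptom rather than fixing the export, but it makes the model usable while the underlying cause is tracked down.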

It is weird that this doesn't happen with the example detector.tflite or with an exported frozen inference graph. There must be a proper way to export tflite models.
