How to convert frozen graph to TensorFlow lite

I have been trying all day to follow https://www.tensorflow.org/lite/examples/object_detection/overview#model_customization to convert any of the TensorFlow zoo models to a TensorFlow Lite model for running on Android, with no luck.

I downloaded several of the models from https://github.com/tensorflow/models/blob/master/research/object_detection/g3doc/tf1_detection_zoo.md. (FYI, Chrome does not let you download from these links because they are not HTTPS; I had to right-click, Inspect the link, and click on the link in the inspector.)

I have the script:

import tensorflow as tf

converter = tf.lite.TFLiteConverter.from_frozen_graph(
    graph_def_file='frozen_graph.pb',
    input_shapes = {'normalized_input_image_tensor':[1,300,300,3]},
    input_arrays = ['normalized_input_image_tensor'],
    output_arrays = ['TFLite_Detection_PostProcess','TFLite_Detection_PostProcess:1', 'TFLite_Detection_PostProcess:2','TFLite_Detection_PostProcess:3']
)
tflite_model = converter.convert()

with open('model.tflite', 'wb') as f:
  f.write(tflite_model)

but it gives the error: ValueError: Invalid tensors 'normalized_input_image_tensor' were found

so the lines input_shapes = {'normalized_input_image_tensor': [1,300,300,3]}, input_arrays = ['normalized_input_image_tensor'], and output_arrays = ['TFLite_Detection_PostProcess', 'TFLite_Detection_PostProcess:1', 'TFLite_Detection_PostProcess:2', 'TFLite_Detection_PostProcess:3'] must be wrong and need a different shape. But how do I get the right values for each of the zoo models, or is there some pre-conversion code I need to run first?
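
For reference, one way to see which input tensors actually exist in a frozen graph (and their declared shapes) is to scan the GraphDef for Placeholder ops. A minimal sketch, assuming TensorFlow 2.x with the v1 compat API and that frozen_graph.pb is the file unpacked from the zoo archive:

import tensorflow as tf

# Load the frozen GraphDef and print every Placeholder op with its declared shape.
with tf.io.gfile.GFile('frozen_graph.pb', 'rb') as f:
    graph_def = tf.compat.v1.GraphDef()
    graph_def.ParseFromString(f.read())

for node in graph_def.node:
    if node.op == 'Placeholder':
        print(node.name, node.attr['shape'].shape)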

Running the "code snipet" below I get,运行下面我得到的“代码片段”,

--------------------------------------------------
Frozen model layers:
name: "add/y"
op: "Const"
attr {
  key: "dtype"
  value {
    type: DT_FLOAT
  }
}
attr {
  key: "value"
  value {
    tensor {
      dtype: DT_FLOAT
      tensor_shape {
      }
      float_val: 1.0
    }
  }
}

Input layer:  add/y
Output layer:  Postprocessor/BatchMultiClassNonMaxSuppression/map/while/NextIteration_1
--------------------------------------------------

But I don't see how this maps to the input_shape or helps with the conversion.

Is it even possible to convert models like faster_rcnn_inception_v2_coco to tflite? I read somewhere that only SSD models are supported.

This code snippet

import tensorflow as tf

def print_layers(graph_def):
    def _imports_graph_def():
        tf.compat.v1.import_graph_def(graph_def, name="")

    wrapped_import = tf.compat.v1.wrap_function(_imports_graph_def, [])
    import_graph = wrapped_import.graph

    print("-" * 50)
    print("Frozen model layers: ")
    ops = import_graph.get_operations()
    layers = [op.name for op in ops]
    print(ops[0])
    print("Input layer: ", layers[0])
    print("Output layer: ", layers[-1])
    print("-" * 50)

# Load frozen graph using TensorFlow 1.x functions
with tf.io.gfile.GFile("model.pb", "rb") as f:
    graph_def = tf.compat.v1.GraphDef()
    loaded = graph_def.ParseFromString(f.read())

print_layers(graph_def=graph_def)

prints the attributes of the input layer, including its shape, along with the names of the input and output layers:

--------------------------------------------------
Frozen model layers: 
name: "image_tensor"
op: "Placeholder"
attr {
  key: "dtype"
  value {
    type: DT_UINT8
  }
}
attr {
  key: "shape"
  value {
    shape {
      dim {
        size: -1
      }
      dim {
        size: -1
      }
      dim {
        size: -1
      }
      dim {
        size: 3
      }
    }
  }
}

Input layer:  image_tensor
Output layer:  detection_classes
--------------------------------------------------
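
For comparison with the script in the question: once the real tensor names are known, they can be plugged into the TF 1.x-style converter. A minimal sketch, assuming TensorFlow 2.x with the compat API; only detection_classes appears in the output above, so the other three output names (the usual TF1 Object Detection API outputs) are assumptions that should be verified against the graph:

import tensorflow as tf

# model.pb is the frozen graph inspected above; the image_tensor placeholder
# has dynamic height/width, so a fixed shape must be supplied here.
converter = tf.compat.v1.lite.TFLiteConverter.from_frozen_graph(
    graph_def_file='model.pb',
    input_arrays=['image_tensor'],
    output_arrays=['detection_boxes', 'detection_scores',
                   'detection_classes', 'num_detections'],
    input_shapes={'image_tensor': [1, 300, 300, 3]},
)
# The NMS post-processing is not a TFLite builtin, so allow Select TF ops;
# the conversion can still fail on control-flow ops, which is why the
# saved_model route in the next answer is usually easier.
converter.target_spec.supported_ops = [tf.lite.OpsSet.TFLITE_BUILTINS,
                                       tf.lite.OpsSet.SELECT_TF_OPS]
tflite_model = converter.convert()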

I think this article can help you.

Those models were made with TensorFlow version 1, so you have to use the saved_model to generate a concrete function (because TFLite doesn't like dynamic input shapes), and from there convert to TFLite.

I will write down a simple solution that you can use immediately.

Open a Colab notebook; it is free and online. Go to this address and click on New Notebook at the bottom right.

First cell (input below and execute with the play button):

!wget http://download.tensorflow.org/models/object_detection/ssd_mobilenet_v1_fpn_shared_box_predictor_640x640_coco14_sync_2018_07_03.tar.gz
!tar -xzvf "/content/ssd_mobilenet_v1_fpn_shared_box_predictor_640x640_coco14_sync_2018_07_03.tar.gz" -C "/content/"

Second cell (input, execute):

import tensorflow as tf
print(tf.__version__)

Third cell (input, execute):

model = tf.saved_model.load('/content/ssd_mobilenet_v1_fpn_shared_box_predictor_640x640_coco14_sync_2018_07_03/saved_model')
concrete_func = model.signatures[
    tf.saved_model.DEFAULT_SERVING_SIGNATURE_DEF_KEY]
concrete_func.inputs[0].set_shape([1, 300, 300, 3])

converter = tf.lite.TFLiteConverter.from_concrete_functions([concrete_func])
converter.optimizations = [tf.lite.Optimize.DEFAULT]
converter.target_spec.supported_ops = [tf.lite.OpsSet.TFLITE_BUILTINS, tf.lite.OpsSet.SELECT_TF_OPS]

tflite_model = converter.convert()

with open('detect.tflite', 'wb') as f:
  f.write(tflite_model)

The code below is necessary because there are some ops that are not supported natively by TFLite:

converter.target_spec.supported_ops = [tf.lite.OpsSet.TFLITE_BUILTINS, tf.lite.OpsSet.SELECT_TF_OPS]

but then you also have to add the corresponding dependency to the mobile project, following this.

If you want to shave some MB off the tflite file and make it smaller, follow these procedures.
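
Those procedures are presumably post-training quantization. As one illustrative option (not necessarily the exact steps behind the link), float16 quantization roughly halves the file size; a minimal sketch reusing the converter from the third cell:

# Optional float16 post-training quantization (illustrative; check the
# official TFLite quantization guide for the currently recommended flags).
converter.optimizations = [tf.lite.Optimize.DEFAULT]
converter.target_spec.supported_types = [tf.float16]

tflite_fp16_model = converter.convert()
with open('detect_fp16.tflite', 'wb') as f:
  f.write(tflite_fp16_model)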

After completion you will see a detect.tflite model in the file browser on the left side.

Go to netron.app and copy-paste the file or browse to upload it. You will see all the details.
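
The same input and output details can also be checked programmatically with the TFLite interpreter. A minimal sketch, assuming the full tensorflow pip package (whose interpreter bundles the Select TF ops support this model needs):

import tensorflow as tf

interpreter = tf.lite.Interpreter(model_path='detect.tflite')
interpreter.allocate_tensors()

# Each entry describes a tensor: name, shape, dtype and quantization parameters.
print(interpreter.get_input_details())
print(interpreter.get_output_details())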

If I have forgotten something or you need anything more, ping me.

Happy coding.
