
Tensorflow Object Detection - Convert .pb file to tflite

I am trying to convert a frozen SSD MobileNet v2 model to TFLite format for Android usage. Here are all my steps:

  1. I retrain with the TF Object Detection API's train.py script, using the ssd_mobilenet_v2_coco_2018_03_29 model from the model zoo. (OK)

  2. Export the trained model.ckpt to a frozen model file using export_inference_graph.py, also provided by the TF Object Detection API. (OK)

  3. Test the frozen graph in Python, with GPU and also with only CPU allowed. It works (a minimal test sketch follows this list). (OK)
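
For reference, here is roughly how I test the frozen graph in step 3 (a minimal sketch; the frozen graph path and the 832x832 input size come from my setup, and the tensor names are the standard Object Detection API outputs):

import numpy as np
import tensorflow as tf

# Load the frozen graph written by export_inference_graph.py.
graph_def = tf.GraphDef()
with tf.gfile.GFile('inference_graph/frozen_inference_graph.pb', 'rb') as f:
    graph_def.ParseFromString(f.read())

with tf.Graph().as_default() as graph:
    tf.import_graph_def(graph_def, name='')

# Feed one dummy uint8 image and fetch the standard detection outputs.
with tf.Session(graph=graph) as sess:
    image = np.zeros((1, 832, 832, 3), dtype=np.uint8)
    num, boxes, scores, classes = sess.run(
        ['num_detections:0', 'detection_boxes:0',
         'detection_scores:0', 'detection_classes:0'],
        feed_dict={'image_tensor:0': image})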

Here comes the problem. I try to use the following code:

import tensorflow as tf

tf.enable_eager_execution()
saved_model_dir = 'inference_graph/saved_model/'
# input_arrays/output_arrays: tensor names from the SavedModel signature (see the log below).
input_arrays = ['image_tensor']
output_arrays = ['num_detections', 'detection_boxes', 'detection_scores', 'detection_classes']
converter = tf.contrib.lite.TFLiteConverter.from_saved_model(
    saved_model_dir, input_arrays=input_arrays, output_arrays=output_arrays,
    input_shapes={'image_tensor': [1, 832, 832, 3]})
converter.post_training_quantize = True

First I tried without the input_shapes parameter, but it didn't work. Since then I have read that you can put anything there; it doesn't matter.

The output up to this line:

INFO:tensorflow:Saver not created because there are no variables in the graph to restore
INFO:tensorflow:The specified SavedModel has no variables; no checkpoints were restored.
INFO:tensorflow:The given SavedModel MetaGraphDef contains SignatureDefs with the following keys: {'serving_default'}
INFO:tensorflow:input tensors info: 
INFO:tensorflow:Tensor's key in saved_model's tensor_map: inputs
INFO:tensorflow: tensor name: image_tensor:0, shape: (-1, -1, -1, 3), type: DT_UINT8
INFO:tensorflow:output tensors info: 
INFO:tensorflow:Tensor's key in saved_model's tensor_map: num_detections
INFO:tensorflow: tensor name: num_detections:0, shape: (-1), type: DT_FLOAT
INFO:tensorflow:Tensor's key in saved_model's tensor_map: detection_boxes
INFO:tensorflow: tensor name: detection_boxes:0, shape: (-1, 100, 4), type: DT_FLOAT
INFO:tensorflow:Tensor's key in saved_model's tensor_map: detection_scores
INFO:tensorflow: tensor name: detection_scores:0, shape: (-1, 100), type: DT_FLOAT
INFO:tensorflow:Tensor's key in saved_model's tensor_map: detection_classes
INFO:tensorflow: tensor name: detection_classes:0, shape: (-1, 100), type: DT_FLOAT
INFO:tensorflow:Saver not created because there are no variables in the graph to restore
INFO:tensorflow:The specified SavedModel has no variables; no checkpoints were restored.
INFO:tensorflow:Froze 0 variables.
INFO:tensorflow:Converted 0 variables to const ops.

Then I try to convert:

tflite_quantized_model = converter.convert()

This is the output:

---------------------------------------------------------------------------
RuntimeError                              Traceback (most recent call last)
<ipython-input-6-61a136476642> in <module>
----> 1 tflite_quantized_model = converter.convert()

~/.local/lib/python3.5/site-packages/tensorflow/contrib/lite/python/lite.py in convert(self)
    451           input_tensors=self._input_tensors,
    452           output_tensors=self._output_tensors,
--> 453           **converter_kwargs)
    454     else:
    455       # Graphs without valid tensors cannot be loaded into tf.Session since they

~/.local/lib/python3.5/site-packages/tensorflow/contrib/lite/python/convert.py in toco_convert_impl(input_data, input_tensors, output_tensors, *args, **kwargs)
    340   data = toco_convert_protos(model_flags.SerializeToString(),
    341                              toco_flags.SerializeToString(),
--> 342                              input_data.SerializeToString())
    343   return data
    344 

~/.local/lib/python3.5/site-packages/tensorflow/contrib/lite/python/convert.py in toco_convert_protos(model_flags_str, toco_flags_str, input_data_str)
    133     else:
    134       raise RuntimeError("TOCO failed see console for info.\n%s\n%s\n" %
--> 135                          (stdout, stderr))
    136 
    137 

RuntimeError: TOCO failed see console for info.

I can't copy the console output here because it's more than the 30,000-character limit, but you can see it here: https://pastebin.com/UyT2x2Vk

Please, please help at this point: what should I do to make it work? :(

My config: Ubuntu 16.04, TensorFlow-GPU 1.12

Thanks in advance!

I had the same issue last week and resolved it by following the steps described here.

Basically, the issue is that the main export script does not support SSD models. I did not use Bazel for this, but the tflite_convert utility.

Be careful with the export_tflite_ssd_graph.py script; read all its options before using it (mainly --max_detections, which saved my life).

Hope this helps.

Edit: Your step 2 is invalid. A saved_model cannot be converted to a tflite model if it contains an SSD. You need to export the trained model.ckpt using the export_tflite_ssd_graph.py script and then convert the resulting .pb file to tflite with the tflite_convert tool.
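
To make this concrete, here is a rough Python sketch of the conversion step, equivalent to what I did with the tflite_convert CLI (the paths are placeholders, and the 1x300x300x3 shape assumes the stock ssd_mobilenet_v2 pipeline config; use whatever your fixed_shape_resizer says). The tensor names below are the ones export_tflite_ssd_graph.py produces when run with --add_postprocessing_op=true:

import tensorflow as tf

# tflite_graph.pb is the file written by export_tflite_ssd_graph.py
# ('tflite_export/' is a placeholder directory).
converter = tf.contrib.lite.TFLiteConverter.from_frozen_graph(
    graph_def_file='tflite_export/tflite_graph.pb',
    input_arrays=['normalized_input_image_tensor'],
    output_arrays=['TFLite_Detection_PostProcess',
                   'TFLite_Detection_PostProcess:1',
                   'TFLite_Detection_PostProcess:2',
                   'TFLite_Detection_PostProcess:3'],
    input_shapes={'normalized_input_image_tensor': [1, 300, 300, 3]})
# The detection postprocessing node is a TFLite custom op, so allow it.
converter.allow_custom_ops = True
tflite_model = converter.convert()
with open('detect.tflite', 'wb') as f:
    f.write(tflite_model)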

Your .pb file is not in the right format. Here is the solution: https://github.com/peace195/tensorflow-lite-yolo-v3

We need to do two steps:

  1. Convert .weights to SavedModel.

  2. Use tflite_convert to convert the SavedModel to tflite format.

Please use Docker to set up the environment and follow the instructions carefully.
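
For step 2, a minimal sketch using the Python API instead of the tflite_convert CLI (the SavedModel path is a placeholder for whatever step 1 writes):

import tensorflow as tf

# 'saved_model/' is a placeholder for the directory produced in step 1.
converter = tf.contrib.lite.TFLiteConverter.from_saved_model('saved_model/')
tflite_model = converter.convert()
with open('yolo_v3.tflite', 'wb') as f:
    f.write(tflite_model)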
