
Unable to infer results using tflite object detection model

I successfully converted an ssd_mobilenet_v3 model (previously trained on the COCO dataset) into a .tflite file by retraining the model.

But when using this model to run object-detection inference on a single image with the following code:

interpreter = tf.lite.Interpreter(model_path)

I am getting the following error:

ValueError                                Traceback (most recent call last)
<ipython-input-15-e1c9008b610c> in <module>
----> 1 interpreter = tf.lite.Interpreter("/home/sushanth/Documents/nuts_poc/tflite_od/nam_model_quantized.lite")
  2 interpreter.allocate_tensors()
  3 input_details = interpreter.get_input_details()
  4 output_details = interpreter.get_output_details()
  5 input_tensor_index = interpreter.get_input_details()[0]["index"]

~/.local/lib/python3.7/site-packages/tensorflow/lite/python/interpreter.py   in __init__(self, model_path, model_content)
 75       self._interpreter = (
 76             _interpreter_wrapper.InterpreterWrapper_CreateWrapperCPPFromFile(
---> 77               model_path))
 78       if not self._interpreter:
 79         raise ValueError('Failed to open {}'.format(model_path))

ValueError: Op builtin_code out of range: 117. Are you using old TFLite binary with newer model? Registration failed.

Can someone explain the error and suggest a possible solution?

TensorFlow version: 1.14

OS: Ubuntu 18.04

Python: 3.7

PS: I converted a classifier model (Inception v2) into TFLite and loaded it with the same code ("interpreter = tf.lite.Interpreter(model_path)") without any error!

The error means the .tflite model contains an op (builtin_code 117) that the installed TFLite runtime is too old to recognize: the model was converted with a newer toolchain than the interpreter loading it. Update your TensorFlow version to >=2.0.0 so the runtime's op registry includes that builtin.
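A minimal sketch of the fix, after upgrading with `pip install --upgrade tensorflow`: check the runtime version first, then load the model and run inference on a dummy input. The model path is the one from the asker's traceback; the `tf_version_at_least` helper is my own illustration, not a TensorFlow API.

```python
def tf_version_at_least(version_str, required=(2, 0, 0)):
    """Return True if a 'major.minor.patch' version string meets `required`."""
    parts = tuple(int(p) for p in version_str.split(".")[:3])
    return parts >= required

if __name__ == "__main__":
    import numpy as np
    import tensorflow as tf

    # The op with builtin_code 117 needs a TF >= 2.0 runtime to register.
    assert tf_version_at_least(tf.__version__), "upgrade TensorFlow to >= 2.0.0"

    interpreter = tf.lite.Interpreter(
        model_path="/home/sushanth/Documents/nuts_poc/tflite_od/nam_model_quantized.lite")
    interpreter.allocate_tensors()

    input_details = interpreter.get_input_details()
    output_details = interpreter.get_output_details()

    # Feed a dummy image of the shape and dtype the model expects.
    dummy = np.zeros(input_details[0]["shape"], dtype=input_details[0]["dtype"])
    interpreter.set_tensor(input_details[0]["index"], dummy)
    interpreter.invoke()
    detections = interpreter.get_tensor(output_details[0]["index"])
```

If the `ValueError` persists on a 2.x runtime, re-export the model with the matching converter version so the converter and interpreter agree on the op set.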
