Convert .pb to .tflite for a model of variable input shape
I was working on a problem where I trained a model using the TensorFlow Object Detection API with a custom dataset. I am using tf version 2.2.0.
output_directory = 'inference_graph'
!python /content/models/research/object_detection/exporter_main_v2.py \
--trained_checkpoint_dir {model_dir} \
--output_directory {output_directory} \
--pipeline_config_path {pipeline_config_path}
I was able to successfully get a .pb file along with the .ckpt file. But now I need to convert it to .tflite, and I am unable to do so; I keep getting one error or another.
I tried the basic way described in the TensorFlow documentation, but that didn't work either. Another piece of code I tried is below:
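As a quick sanity check on the export (the helper name and path below are illustrative, not from the original post), you can inspect the exported SavedModel's serving signature to see its input shapes, including any dynamic dimensions:

```python
import tensorflow as tf

def input_signature(saved_model_dir):
    """Return the structured input signature of the SavedModel's
    serving_default function, e.g. to check for dynamic dimensions."""
    loaded = tf.saved_model.load(saved_model_dir)
    serving_fn = loaded.signatures['serving_default']
    return serving_fn.structured_input_signature
```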
import tensorflow as tf
from tensorflow.keras.models import Sequential, Model
from tensorflow.keras.layers import Conv2D, Flatten, MaxPooling2D, Dense, Input, Reshape, Concatenate, GlobalAveragePooling2D, BatchNormalization, Dropout, Activation, GlobalMaxPooling2D
from tensorflow.keras.utils import Sequence
model = tf.saved_model.load(f'/content/drive/MyDrive/FINAL DNET MODEL/inference_graph/saved_model/')
converter = tf.lite.TFLiteConverter.from_keras_model(model)
converter.post_training_quantize=True
converter.inference_type=tf.uint8
tflite_model = converter.convert()
open("val_converted_model_int8.tflite", "wb").write(tflite_model)
The error I am getting is:
AttributeError                            Traceback (most recent call last)
<ipython-input> in <module>()
      8 converter.post_training_quantize=True
      9 converter.inference_type=tf.uint8
---> 10 tflite_model = converter.convert()
     11 open("val_converted_model_int8.tflite", "wb").write(tflite_model)

/usr/local/lib/python3.6/dist-packages/tensorflow/lite/python/lite.py in convert(self)
    837     # to None.
    838     # Once we have better support for dynamic shapes, we can remove this.
--> 839     if not isinstance(self._keras_model.call, _def_function.Function):
    840       # Pass `keep_original_batch_size=True` will ensure that we get an input
    841       # signature including the batch dimension specified by the user.

AttributeError: '_UserObject' object has no attribute 'call'
Can anyone please help me with this?
I think the problem is not the variable input shape (although the error message is confusing). tf.saved_model.load returns a SavedModel, but tf.lite.TFLiteConverter.from_keras_model expects a Keras model, so it cannot handle it.
You need to use the TFLiteConverter.from_saved_model API instead. Something like this:
saved_model_dir = '/content/drive/MyDrive/FINAL DNET MODEL/inference_graph/saved_model/'
converter = tf.lite.TFLiteConverter.from_saved_model(saved_model_dir)
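The snippet above stops before the actual conversion; a minimal end-to-end sketch might look like the following (the helper name convert_saved_model and the quantization option are mine, not from the original answer):

```python
import tensorflow as tf

def convert_saved_model(saved_model_dir, out_path):
    """Convert a TF2 SavedModel directory to a .tflite flatbuffer
    and write it to out_path."""
    converter = tf.lite.TFLiteConverter.from_saved_model(saved_model_dir)
    # Optional: dynamic-range post-training quantization.
    converter.optimizations = [tf.lite.Optimize.DEFAULT]
    tflite_model = converter.convert()
    with open(out_path, 'wb') as f:
        f.write(tflite_model)
    return tflite_model
```

Note that for detection models exported by the Object Detection API, you may also need to export a TFLite-friendly graph first (the repository ships an export_tflite_graph_tf2.py script for this) before the converter will accept it.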
Let us know if you run into other issues.