How to save TensorRT graph generated from frozen inference graph?
I use the following script to convert my frozen_inference_graph into a TensorRT-optimized one:
```python
import tensorflow as tf
from tensorflow.python.compiler.tensorrt import trt_convert as trt

with tf.Session() as sess:
    # First deserialize your frozen graph:
    with tf.gfile.GFile('frozen_inference_graph.pb', 'rb') as f:
        frozen_graph = tf.GraphDef()
        frozen_graph.ParseFromString(f.read())
    # Now you can create a TensorRT inference graph from your
    # frozen graph:
    converter = trt.TrtGraphConverter(
        input_graph_def=frozen_graph,
        nodes_blacklist=['outputs/Softmax'])  # output nodes
    trt_graph = converter.convert()
    # Import the TensorRT graph into a new graph and run:
    output_node = tf.import_graph_def(
        trt_graph,
        return_elements=['outputs/Softmax'])
    sess.run(output_node)
```
My question is: how can I save this optimized graph to disk so I can later load it and run inference with it?
Yes, you can just add these two lines:
```python
saved_model_dir_trt = "./tensorrt_model.trt"
converter.save(saved_model_dir_trt)
```