
Loading a saved Tensorflow model from its .meta file

I am trying to load a TensorFlow meta graph from a saved checkpoint using TensorFlow version 1.15, in order to convert it to a SavedModel for TensorFlow Serving. It is a speech recognition model with local attention and a unidirectional LSTM, implemented using the Returnn toolkit with the TensorFlow backend. I am using the following code.

import tensorflow as tf
from tensorflow.python.saved_model import signature_constants
from tensorflow.python.saved_model import tag_constants
import sys

if len(sys.argv) != 2:
    print("Usage: " + sys.argv[0] + " save_dir")
    exit(1)
export_dir = sys.argv[1]
builder = tf.compat.v1.saved_model.builder.SavedModelBuilder(export_dir)
sigs = {}
with tf.Session(graph=tf.Graph()) as sess:
    # Restore the graph definition from the .meta file and the weights
    # from the latest checkpoint in the same directory.
    new_saver = tf.train.import_meta_graph("./serv_test/model.238.meta")
    new_saver.restore(sess, tf.train.latest_checkpoint("./serv_test"))
    graph = tf.get_default_graph()
    # Input / output tensors of the inference sub-graph.
    input_audio = graph.get_tensor_by_name('inference/default/wav:0')
    output_hyps = graph.get_tensor_by_name('inference/default/Reshape_7:0')
    sigs[signature_constants.DEFAULT_SERVING_SIGNATURE_DEF_KEY] = \
        tf.saved_model.signature_def_utils.predict_signature_def(
            {"in": input_audio}, {"out": output_hyps})
    builder.add_meta_graph_and_variables(
        sess, [tag_constants.SERVING], signature_def_map=sigs)
builder.save()

But I am getting the following error in the import_meta_graph line:

Traceback (most recent call last):
  File "xport.py", line 16, in <module>
    new_saver=tf.train.import_meta_graph("./serv_test/model.238.meta")
  File "/home/ubuntu/tf1.15/lib/python3.6/site-packages/tensorflow_core/python/training/saver.py", line 1453, in import_meta_graph
    **kwargs)[0]
  File "/home/ubuntu/tf1.15/lib/python3.6/site-packages/tensorflow_core/python/training/saver.py", line 1477, in _import_meta_graph_with_return_elements
    **kwargs))
  File "/home/ubuntu/tf1.15/lib/python3.6/site-packages/tensorflow_core/python/framework/meta_graph.py", line 809, in import_scoped_meta_graph_with_return_elements
    return_elements=return_elements)
  File "/home/ubuntu/tf1.15/lib/python3.6/site-packages/tensorflow_core/python/util/deprecation.py", line 507, in new_func
    return func(*args, **kwargs)
  File "/home/ubuntu/tf1.15/lib/python3.6/site-packages/tensorflow_core/python/framework/importer.py", line 405, in import_graph_def
    producer_op_list=producer_op_list)
  File "/home/ubuntu/tf1.15/lib/python3.6/site-packages/tensorflow_core/python/framework/importer.py", line 501, in _import_graph_def_internal
    graph._c_graph, serialized, options)  # pylint: disable=protected-access
tensorflow.python.framework.errors_impl.NotFoundError: Op type not registered
 'NativeLstm2' in binary running on ip-10-1-21-241. Make sure the Op and Kernel
 are registered in the binary running in this process. Note that if you are loading a
 saved graph which used ops from tf.contrib, accessing (e.g.) `tf.contrib.resampler`
 should be done before importing the graph, as contrib ops are lazily registered when
 the module is first accessed.

Is there any way to get around this error? Is it because of the custom-built layers used in Returnn? Is there any way to make a Returnn model servable with TensorFlow Serving? Thanks.
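
The error message itself hints at the direction: the op has to be registered in the running process before the graph that uses it is imported. Assuming NativeLstm2 comes from RETURNN's compiled native ops rather than stock TensorFlow, one hedged sketch is to load the compiled shared library first; the library path below is a placeholder, not the real filename:

import tensorflow as tf

# Register the custom op kernels before importing the graph that uses them.
# Hypothetical path: RETURNN compiles its native ops into a shared library.
tf.load_op_library("/path/to/NativeLstm2.so")

with tf.Session() as sess:
    new_saver = tf.train.import_meta_graph("./serv_test/model.238.meta")
    new_saver.restore(sess, tf.train.latest_checkpoint("./serv_test"))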

You should remove the graph=tf.Graph(), otherwise your import_meta_graph will import it into the wrong graph. Just see some official TF examples of how to use import_meta_graph.
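
A minimal sketch of that suggestion, with the paths from the question, where the import then targets the default graph:

with tf.Session() as sess:  # no graph=tf.Graph() here
    new_saver = tf.train.import_meta_graph("./serv_test/model.238.meta")
    new_saver.restore(sess, tf.train.latest_checkpoint("./serv_test"))
    graph = tf.get_default_graph()  # the graph the meta graph was imported into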
