
Convert Tensorflow 1 pb model created with AutoML Tables to TensorflowJS to run in NodeJS

I have exported a Tensorflow model with GCloud AutoML Tables and I'm trying to convert it to tensorflowjs json model, but when running the converter, I'm getting the next error:

Op type not registered 'DecodeProtoSparseV2'

I'm using Python 3.8.4 and tensorflowjs 2.0.1.post1.

This is the full output:

λ tensorflowjs_converter --input_format=tf_saved_model --output_node_names=Test --saved_model_tags=serve . web_model

Traceback (most recent call last):
  File "c:\program files\python38\lib\runpy.py", line 194, in _run_module_as_main
    return _run_code(code, main_globals, None,
  File "c:\program files\python38\lib\runpy.py", line 87, in _run_code
    exec(code, run_globals)
  File "C:\Users\MyUser\AppData\Roaming\Python\Python38\Scripts\tensorflowjs_converter.exe\__main__.py", line 7, in <module>
  File "C:\Users\MyUser\AppData\Roaming\Python\Python38\site-packages\tensorflowjs\converters\converter.py", line 735, in pip_main
    main([' '.join(sys.argv[1:])])
  File "C:\Users\MyUser\AppData\Roaming\Python\Python38\site-packages\tensorflowjs\converters\converter.py", line 739, in main
    convert(argv[0].split(' '))
  File "C:\Users\MyUser\AppData\Roaming\Python\Python38\site-packages\tensorflowjs\converters\converter.py", line 673, in convert
    tf_saved_model_conversion_v2.convert_tf_saved_model(
  File "C:\Users\MyUser\AppData\Roaming\Python\Python38\site-packages\tensorflowjs\converters\tf_saved_model_conversion_v2.py", line 469, in convert_tf_saved_model
    model = load(saved_model_dir, saved_model_tags)
  File "C:\Users\MyUser\AppData\Roaming\Python\Python38\site-packages\tensorflow\python\saved_model\load.py", line 578, in load
    return load_internal(export_dir, tags)
  File "C:\Users\MyUser\AppData\Roaming\Python\Python38\site-packages\tensorflow\python\saved_model\load.py", line 613, in load_internal
    root = load_v1_in_v2.load(export_dir, tags)
  File "C:\Users\MyUser\AppData\Roaming\Python\Python38\site-packages\tensorflow\python\saved_model\load_v1_in_v2.py", line 263, in load
    return loader.load(tags=tags)
  File "C:\Users\MyUser\AppData\Roaming\Python\Python38\site-packages\tensorflow\python\saved_model\load_v1_in_v2.py", line 207, in load
    wrapped = wrap_function.wrap_function(
  File "C:\Users\MyUser\AppData\Roaming\Python\Python38\site-packages\tensorflow\python\eager\wrap_function.py", line 604, in wrap_function
    func_graph.func_graph_from_py_func(
  File "C:\Users\MyUser\AppData\Roaming\Python\Python38\site-packages\tensorflow\python\framework\func_graph.py", line 981, in func_graph_from_py_func
    func_outputs = python_func(*func_args, **func_kwargs)
  File "C:\Users\MyUser\AppData\Roaming\Python\Python38\site-packages\tensorflow\python\eager\wrap_function.py", line 86, in __call__
    return self.call_with_variable_creator_scope(self._fn)(*args, **kwargs)
  File "C:\Users\MyUser\AppData\Roaming\Python\Python38\site-packages\tensorflow\python\eager\wrap_function.py", line 92, in wrapped
    return fn(*args, **kwargs)
  File "C:\Users\MyUser\AppData\Roaming\Python\Python38\site-packages\tensorflow\python\saved_model\load_v1_in_v2.py", line 89, in load_graph
    saver, _ = tf_saver._import_meta_graph_with_return_elements(
  File "C:\Users\MyUser\AppData\Roaming\Python\Python38\site-packages\tensorflow\python\training\saver.py", line 1481, in _import_meta_graph_with_return_elements
    meta_graph.import_scoped_meta_graph_with_return_elements(
  File "C:\Users\MyUser\AppData\Roaming\Python\Python38\site-packages\tensorflow\python\framework\meta_graph.py", line 794, in import_scoped_meta_graph_with_return_elements
    imported_return_elements = importer.import_graph_def(
  File "C:\Users\MyUser\AppData\Roaming\Python\Python38\site-packages\tensorflow\python\util\deprecation.py", line 507, in new_func
    return func(*args, **kwargs)
  File "C:\Users\MyUser\AppData\Roaming\Python\Python38\site-packages\tensorflow\python\framework\importer.py", line 400, in import_graph_def
    return _import_graph_def_internal(
  File "C:\Users\MyUser\AppData\Roaming\Python\Python38\site-packages\tensorflow\python\framework\importer.py", line 496, in _import_graph_def_internal
    results = c_api.TF_GraphImportGraphDefWithResults(
tensorflow.python.framework.errors_impl.NotFoundError: Op type not registered 'DecodeProtoSparseV2' in binary running on MYCOMPUTER. Make sure the Op and Kernel are registered in the binary running in this process. Note that if you are loading a saved graph which used ops from tf.contrib, accessing (e.g.) `tf.contrib.resampler` should be done before importing the graph, as contrib ops are lazily registered when the module is first accessed.

This is the model's signature:

saved_model_cli show --dir . --all

MetaGraphDef with tag-set: 'serve' contains the following SignatureDefs:

signature_def['classification']:
  The given SavedModel SignatureDef contains the following input(s):
    inputs['inputs'] tensor_info:
        dtype: DT_STRING
        shape: (-1)
        name: transform/transform/input_proto_tensor:0
  The given SavedModel SignatureDef contains the following output(s):
    outputs['classes'] tensor_info:
        dtype: DT_STRING
        shape: (-1, 2)
        name: head/Tile:0
    outputs['scores'] tensor_info:
        dtype: DT_FLOAT
        shape: (-1, 2)
        name: head/predictions/probabilities:0
  Method name is: tensorflow/serving/classify

signature_def['predict']:
  The given SavedModel SignatureDef contains the following input(s):
    inputs['examples'] tensor_info:
        dtype: DT_STRING
        shape: (-1)
        name: transform/transform/input_proto_tensor:0
  The given SavedModel SignatureDef contains the following output(s):
    outputs['all_class_ids'] tensor_info:
        dtype: DT_INT32
        shape: (-1, 2)
        name: head/predictions/Tile:0
    outputs['all_classes'] tensor_info:
        dtype: DT_STRING
        shape: (-1, 2)
        name: head/predictions/Tile_1:0
    outputs['class_ids'] tensor_info:
        dtype: DT_INT64
        shape: (-1, 1)
        name: head/predictions/ExpandDims:0
    outputs['classes'] tensor_info:
        dtype: DT_STRING
        shape: (-1, 1)
        name: head/predictions/hash_table_Lookup/LookupTableFindV2:0
    outputs['logistic'] tensor_info:
        dtype: DT_FLOAT
        shape: (-1, 1)
        name: head/predictions/logistic:0
    outputs['logits'] tensor_info:
        dtype: DT_FLOAT
        shape: (-1, 1)
        name: gbdt_1/GradientTreesPrediction:0
    outputs['probabilities'] tensor_info:
        dtype: DT_FLOAT
        shape: (-1, 2)
        name: head/predictions/probabilities:0
  Method name is: tensorflow/serving/predict

signature_def['regression']:
  The given SavedModel SignatureDef contains the following input(s):
    inputs['inputs'] tensor_info:
        dtype: DT_STRING
        shape: (-1)
        name: transform/transform/input_proto_tensor:0
  The given SavedModel SignatureDef contains the following output(s):
    outputs['outputs'] tensor_info:
        dtype: DT_FLOAT
        shape: (-1, 1)
        name: head/predictions/logistic:0
  Method name is: tensorflow/serving/regress

signature_def['serving_default']:
  The given SavedModel SignatureDef contains the following input(s):
    inputs['inputs'] tensor_info:
        dtype: DT_STRING
        shape: (-1)
        name: transform/transform/input_proto_tensor:0
  The given SavedModel SignatureDef contains the following output(s):
    outputs['classes'] tensor_info:
        dtype: DT_STRING
        shape: (-1, 2)
        name: head/Tile:0
    outputs['scores'] tensor_info:
        dtype: DT_FLOAT
        shape: (-1, 2)
        name: head/predictions/probabilities:0
  Method name is: tensorflow/serving/classify

The op you mentioned, DecodeProtoSparseV2, is not currently supported by tensorflow.js, and that is why you cannot convert the model. The list of ops currently supported by tensorflow.js can be found here. So unless you change or remove that op from the network, you will not be able to convert the model to js.
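If you want to see every op the exported graph uses (to check each one against the tfjs supported-ops list), you can parse the `saved_model.pb` protobuf directly. A minimal sketch, assuming the file is `saved_model.pb` in the model directory; parsing the proto needs no op registration, so it works even though loading the model fails:

```python
# Sketch: enumerate the op types used by a TF1 SavedModel by reading
# the protobuf directly, so unregistered ops like DecodeProtoSparseV2
# do not prevent inspection. The file path is an assumption.
from tensorflow.core.protobuf import saved_model_pb2


def graph_ops(graph_def):
    """Return the sorted set of op types appearing in a GraphDef."""
    ops = {node.op for node in graph_def.node}
    # Ops can also appear inside functions in the graph's library.
    for fn in graph_def.library.function:
        ops.update(node.op for node in fn.node_def)
    return sorted(ops)


def saved_model_ops(path="saved_model.pb"):
    """List op types across all MetaGraphs of a SavedModel file."""
    sm = saved_model_pb2.SavedModel()
    with open(path, "rb") as f:
        sm.ParseFromString(f.read())
    ops = set()
    for mg in sm.meta_graphs:
        ops.update(graph_ops(mg.graph_def))
    return sorted(ops)


# Usage (run inside the exported model directory):
#   print("\n".join(saved_model_ops("saved_model.pb")))
```

Any op in that list that is missing from the tfjs supported-ops table would have to be removed or replaced before the converter can succeed.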
