
How to export a TensorFlow 2 model that has keypoints?

I am trying to export the CenterNet MobileNetV2 FPN Keypoints 512x512 model using the exporter_main_v2.py script from the TensorFlow 2 Object Detection API.

The model is listed here: https://github.com/tensorflow/models/blob/master/research/object_detection/g3doc/tf2_detection_zoo.md

I built a Docker image with the Detection API following the instructions here: https://github.com/tensorflow/models/blob/master/research/object_detection/g3doc/tf2.md

Dockerfile:

FROM ubuntu:22.04

RUN apt update

RUN apt install -y git wget

WORKDIR /tensorflow
RUN git clone https://github.com/tensorflow/models.git

RUN apt install -y protobuf-compiler software-properties-common python3 python3-pip python-is-python3

WORKDIR /tensorflow/models/research
RUN protoc object_detection/protos/*.proto --python_out=.

# Install TensorFlow Object Detection API.
RUN cp object_detection/packages/tf2/setup.py .
RUN python -m pip install --use-feature=2020-resolver .

Build the Docker image:

#!/bin/bash

docker build \
-t tensorflow-object-detection .

Run the Docker container:

#!/bin/bash

docker run \
-it \
-v $(pwd):/workspace \
tensorflow-object-detection
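
Inside the container, the Object Detection API install can be sanity-checked with the model builder test mentioned in the same tf2.md guide:

cd /tensorflow/models/research
python object_detection/builders/model_builder_tf2_test.py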

Exporting the model without keypoints succeeds:

cd /workspace

wget http://download.tensorflow.org/models/object_detection/tf2/20210210/centernet_mobilenetv2fpn_512x512_coco17_od.tar.gz

tar -xvf centernet_mobilenetv2fpn_512x512_coco17_od.tar.gz

# This works!
python /tensorflow/models/research/object_detection/exporter_main_v2.py \
--input_type float_image_tensor \
--trained_checkpoint_dir /workspace/centernet_mobilenetv2_fpn_od/checkpoint/ \
--pipeline_config_path /workspace/centernet_mobilenetv2_fpn_od/pipeline.config \
--output_directory /workspace/centernet_mobilenetv2fpn_512x512_coco17_od_exported/
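
As a quick check, the signature of the exported SavedModel can be inspected with saved_model_cli (assuming the exporter's usual layout of an output_directory/saved_model subfolder):

saved_model_cli show \
--dir /workspace/centernet_mobilenetv2fpn_512x512_coco17_od_exported/saved_model \
--tag_set serve \
--signature_def serving_default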

However, if I run the same process for the model that has keypoints, it fails with an error:

cd /workspace

wget http://download.tensorflow.org/models/object_detection/tf2/20210210/centernet_mobilenetv2fpn_512x512_coco17_kpts.tar.gz

tar -xvf centernet_mobilenetv2fpn_512x512_coco17_kpts.tar.gz

# Fails!
python /tensorflow/models/research/object_detection/exporter_main_v2.py \
--input_type float_image_tensor \
--trained_checkpoint_dir /workspace/centernet_mobilenetv2_fpn_kpts/checkpoint/ \
--pipeline_config_path /workspace/centernet_mobilenetv2_fpn_kpts/pipeline.config \
--output_directory /workspace/centernet_mobilenetv2fpn_512x512_coco17_kpts_exported/

The error:

python /tensorflow/models/research/object_detection/exporter_main_v2.py --input_type float_image_tensor --trained_checkpoint_dir /workspace/centernet_mobilenetv2_fpn_kpts/checkpoint/ --pipeline_config_path /workspace/centernet_mobilenetv2_fpn_kpts/pipeline.config --output_directory /workspace/centernet_mobilenetv2fpn_512x512_coco17_kpts_exported/
2022-06-25 21:20:40.339771: W tensorflow/stream_executor/platform/default/dso_loader.cc:64] Could not load dynamic library 'libcudart.so.11.0'; dlerror: libcudart.so.11.0: cannot open shared object file: No such file or directory
2022-06-25 21:20:40.339792: I tensorflow/stream_executor/cuda/cudart_stub.cc:29] Ignore above cudart dlerror if you do not have a GPU set up on your machine.
2022-06-25 21:20:42.885043: W tensorflow/stream_executor/platform/default/dso_loader.cc:64] Could not load dynamic library 'libcuda.so.1'; dlerror: libcuda.so.1: cannot open shared object file: No such file or directory
2022-06-25 21:20:42.885063: W tensorflow/stream_executor/cuda/cuda_driver.cc:269] failed call to cuInit: UNKNOWN ERROR (303)
2022-06-25 21:20:42.885079: I tensorflow/stream_executor/cuda/cuda_diagnostics.cc:163] no NVIDIA GPU device is present: /dev/nvidia0 does not exist
WARNING:tensorflow:`input_shape` is undefined or non-square, or `rows` is not in [96, 128, 160, 192, 224]. Weights for input shape (224, 224) will be loaded as the default.
W0625 21:20:42.890892 140467532775424 mobilenet_v2.py:303] `input_shape` is undefined or non-square, or `rows` is not in [96, 128, 160, 192, 224]. Weights for input shape (224, 224) will be loaded as the default.
2022-06-25 21:20:42.891244: I tensorflow/core/platform/cpu_feature_guard.cc:193] This TensorFlow binary is optimized with oneAPI Deep Neural Network Library (oneDNN) to use the following CPU instructions in performance-critical operations:  AVX2 FMA
To enable them in other operations, rebuild TensorFlow with the appropriate compiler flags.
Traceback (most recent call last):
  File "/tensorflow/models/research/object_detection/exporter_main_v2.py", line 164, in <module>
    app.run(main)
  File "/usr/local/lib/python3.10/dist-packages/absl/app.py", line 312, in run
    _run_main(main, args)
  File "/usr/local/lib/python3.10/dist-packages/absl/app.py", line 258, in _run_main
    sys.exit(main(argv))
  File "/tensorflow/models/research/object_detection/exporter_main_v2.py", line 157, in main
    exporter_lib_v2.export_inference_graph(
  File "/usr/local/lib/python3.10/dist-packages/object_detection/exporter_lib_v2.py", line 244, in export_inference_graph
    detection_model = INPUT_BUILDER_UTIL_MAP['model_build'](
  File "/usr/local/lib/python3.10/dist-packages/object_detection/builders/model_builder.py", line 1252, in build
    return build_func(getattr(model_config, meta_architecture), is_training,
  File "/usr/local/lib/python3.10/dist-packages/object_detection/builders/model_builder.py", line 1118, in _build_center_net_model
    label_map_proto = label_map_util.load_labelmap(
  File "/usr/local/lib/python3.10/dist-packages/object_detection/utils/label_map_util.py", line 168, in load_labelmap
    label_map_string = fid.read()
  File "/usr/local/lib/python3.10/dist-packages/tensorflow/python/lib/io/file_io.py", line 114, in read
    self._preread_check()
  File "/usr/local/lib/python3.10/dist-packages/tensorflow/python/lib/io/file_io.py", line 76, in _preread_check
    self._read_buf = _pywrap_file_io.BufferedInputStream(
tensorflow.python.framework.errors_impl.NotFoundError: PATH_TO_BE_CONFIGURED/label_map.txt; No such file or directory

This seems to be related to the PATH_TO_BE_CONFIGURED entries in pipeline.config, but what I don't understand is that the other model I exported successfully also has these. Any ideas?
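
For reference, the remaining placeholders in the keypoints pipeline.config can be listed with grep (paths as extracted above):

grep -n PATH_TO_BE_CONFIGURED /workspace/centernet_mobilenetv2_fpn_kpts/pipeline.config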

Edit:

If I add:

keypoint_label_map_path: "/workspace/centernet_mobilenetv2_fpn_kpts/label_map.txt"
    
...

train_input_reader {
  label_map_path: "/tensorflow/models/research/object_detection/data/mscoco_label_map.pbtxt"

...

eval_input_reader {
  label_map_path: "/tensorflow/models/research/object_detection/data/mscoco_label_map.pbtxt"

and try running again, I get:

python /tensorflow/models/research/object_detection/exporter_main_v2.py --input_type float_image_tensor --trained_checkpoint_dir /workspace/centernet_mobilenetv2_fpn_kpts/checkpoint/ --pipeline_config_path /workspace/centernet_mobilenetv2_fpn_kpts/pipeline.config --output_directory /workspace/centernet_mobilenetv2fpn_512x512_coco17_kpts_exported/
2022-06-26 07:54:41.963666: W tensorflow/stream_executor/platform/default/dso_loader.cc:64] Could not load dynamic library 'libcudart.so.11.0'; dlerror: libcudart.so.11.0: cannot open shared object file: No such file or directory
2022-06-26 07:54:41.963690: I tensorflow/stream_executor/cuda/cudart_stub.cc:29] Ignore above cudart dlerror if you do not have a GPU set up on your machine.
2022-06-26 07:54:44.513790: W tensorflow/stream_executor/platform/default/dso_loader.cc:64] Could not load dynamic library 'libcuda.so.1'; dlerror: libcuda.so.1: cannot open shared object file: No such file or directory
2022-06-26 07:54:44.513817: W tensorflow/stream_executor/cuda/cuda_driver.cc:269] failed call to cuInit: UNKNOWN ERROR (303)
2022-06-26 07:54:44.513834: I tensorflow/stream_executor/cuda/cuda_diagnostics.cc:163] no NVIDIA GPU device is present: /dev/nvidia0 does not exist
WARNING:tensorflow:`input_shape` is undefined or non-square, or `rows` is not in [96, 128, 160, 192, 224]. Weights for input shape (224, 224) will be loaded as the default.
W0626 07:54:44.520438 140465840205824 mobilenet_v2.py:303] `input_shape` is undefined or non-square, or `rows` is not in [96, 128, 160, 192, 224]. Weights for input shape (224, 224) will be loaded as the default.
2022-06-26 07:54:44.520864: I tensorflow/core/platform/cpu_feature_guard.cc:193] This TensorFlow binary is optimized with oneAPI Deep Neural Network Library (oneDNN) to use the following CPU instructions in performance-critical operations:  AVX2 FMA
To enable them in other operations, rebuild TensorFlow with the appropriate compiler flags.
WARNING:tensorflow:From /usr/local/lib/python3.10/dist-packages/tensorflow/python/autograph/impl/api.py:458: calling map_fn_v2 (from tensorflow.python.ops.map_fn) with back_prop=False is deprecated and will be removed in a future version.
Instructions for updating:
back_prop=False is deprecated. Consider using tf.stop_gradient instead.
Instead of:
results = tf.map_fn(fn, elems, back_prop=False)
Use:
results = tf.nest.map_structure(tf.stop_gradient, tf.map_fn(fn, elems))
W0626 07:54:46.972441 140465840205824 deprecation.py:623] From /usr/local/lib/python3.10/dist-packages/tensorflow/python/autograph/impl/api.py:458: calling map_fn_v2 (from tensorflow.python.ops.map_fn) with back_prop=False is deprecated and will be removed in a future version.
Instructions for updating:
back_prop=False is deprecated. Consider using tf.stop_gradient instead.
Instead of:
results = tf.map_fn(fn, elems, back_prop=False)
Use:
results = tf.nest.map_structure(tf.stop_gradient, tf.map_fn(fn, elems))
Traceback (most recent call last):
  File "/tensorflow/models/research/object_detection/exporter_main_v2.py", line 164, in <module>
    app.run(main)
  File "/usr/local/lib/python3.10/dist-packages/absl/app.py", line 312, in run
    _run_main(main, args)
  File "/usr/local/lib/python3.10/dist-packages/absl/app.py", line 258, in _run_main
    sys.exit(main(argv))
  File "/tensorflow/models/research/object_detection/exporter_main_v2.py", line 157, in main
    exporter_lib_v2.export_inference_graph(
  File "/usr/local/lib/python3.10/dist-packages/object_detection/exporter_lib_v2.py", line 270, in export_inference_graph
    concrete_function = detection_module.__call__.get_concrete_function()
  File "/usr/local/lib/python3.10/dist-packages/tensorflow/python/eager/def_function.py", line 1239, in get_concrete_function
    concrete = self._get_concrete_function_garbage_collected(*args, **kwargs)
  File "/usr/local/lib/python3.10/dist-packages/tensorflow/python/eager/def_function.py", line 1219, in _get_concrete_function_garbage_collected
    self._initialize(args, kwargs, add_initializers_to=initializers)
  File "/usr/local/lib/python3.10/dist-packages/tensorflow/python/eager/def_function.py", line 785, in _initialize
    self._stateful_fn._get_concrete_function_internal_garbage_collected(  # pylint: disable=protected-access
  File "/usr/local/lib/python3.10/dist-packages/tensorflow/python/eager/function.py", line 2480, in _get_concrete_function_internal_garbage_collected
    graph_function, _ = self._maybe_define_function(args, kwargs)
  File "/usr/local/lib/python3.10/dist-packages/tensorflow/python/eager/function.py", line 2711, in _maybe_define_function
    graph_function = self._create_graph_function(args, kwargs)
  File "/usr/local/lib/python3.10/dist-packages/tensorflow/python/eager/function.py", line 2627, in _create_graph_function
    func_graph_module.func_graph_from_py_func(
  File "/usr/local/lib/python3.10/dist-packages/tensorflow/python/framework/func_graph.py", line 1141, in func_graph_from_py_func
    func_outputs = python_func(*func_args, **func_kwargs)
  File "/usr/local/lib/python3.10/dist-packages/tensorflow/python/eager/def_function.py", line 677, in wrapped_fn
    out = weak_wrapped_fn().__wrapped__(*args, **kwds)
  File "/usr/local/lib/python3.10/dist-packages/tensorflow/python/eager/function.py", line 3251, in bound_method_wrapper
    return wrapped_fn(*args, **kwargs)
  File "/usr/local/lib/python3.10/dist-packages/tensorflow/python/framework/func_graph.py", line 1127, in autograph_handler
    raise e.ag_error_metadata.to_exception(e)
  File "/usr/local/lib/python3.10/dist-packages/tensorflow/python/framework/func_graph.py", line 1116, in autograph_handler
    return autograph.converted_call(
  File "/usr/local/lib/python3.10/dist-packages/tensorflow/python/autograph/impl/api.py", line 439, in converted_call
    result = converted_f(*effective_args, **kwargs)
  File "/tmp/__autograph_generated_filegn72uv59.py", line 13, in tf____call__
    retval_ = ag__.converted_call(ag__.ld(self)._run_inference_on_images, (ag__.ld(images), ag__.ld(true_shapes)), None, fscope)
  File "/usr/local/lib/python3.10/dist-packages/tensorflow/python/autograph/impl/api.py", line 441, in converted_call
    result = converted_f(*effective_args)
  File "/tmp/__autograph_generated_filemo8dzp2i.py", line 22, in tf___run_inference_on_images
    detections = ag__.converted_call(ag__.ld(self)._model.postprocess, (ag__.ld(prediction_dict), ag__.ld(true_shapes)), None, fscope)
  File "/usr/local/lib/python3.10/dist-packages/tensorflow/python/autograph/impl/api.py", line 441, in converted_call
    result = converted_f(*effective_args)
  File "/tmp/__autograph_generated_fileqexjpupt.py", line 213, in tf__postprocess
    ag__.if_stmt(ag__.ld(self)._kp_params_dict, if_body_7, else_body_7, get_state_7, set_state_7, ('boxes',), 1)
  File "/usr/local/lib/python3.10/dist-packages/tensorflow/python/autograph/operators/control_flow.py", line 1341, in if_stmt
    _py_if_stmt(cond, body, orelse)
  File "/usr/local/lib/python3.10/dist-packages/tensorflow/python/autograph/operators/control_flow.py", line 1394, in _py_if_stmt
    return body() if cond else orelse()
  File "/tmp/__autograph_generated_fileqexjpupt.py", line 184, in if_body_7
    ag__.if_stmt(ag__.and_(lambda : ag__.converted_call(ag__.ld(len), (ag__.ld(self)._kp_params_dict,), None, fscope) == 1, lambda : ag__.ld(self)._num_classes == 1), if_body_5, else_body_5, get_state_5, set_state_5, ('keypoint_scores', 'keypoints'), 2)
  File "/usr/local/lib/python3.10/dist-packages/tensorflow/python/autograph/operators/control_flow.py", line 1341, in if_stmt
    _py_if_stmt(cond, body, orelse)
  File "/usr/local/lib/python3.10/dist-packages/tensorflow/python/autograph/operators/control_flow.py", line 1394, in _py_if_stmt
    return body() if cond else orelse()
  File "/tmp/__autograph_generated_fileqexjpupt.py", line 158, in if_body_5
    ag__.if_stmt(ag__.ld(kp_params).argmax_postprocessing, if_body_3, else_body_3, get_state_3, set_state_3, ('keypoint_depths', 'keypoint_scores', 'keypoints'), 3)
  File "/usr/local/lib/python3.10/dist-packages/tensorflow/python/autograph/operators/control_flow.py", line 1341, in if_stmt
    _py_if_stmt(cond, body, orelse)
  File "/usr/local/lib/python3.10/dist-packages/tensorflow/python/autograph/operators/control_flow.py", line 1394, in _py_if_stmt
    return body() if cond else orelse()
  File "/tmp/__autograph_generated_fileqexjpupt.py", line 155, in else_body_3
    (keypoints, keypoint_scores, keypoint_depths) = ag__.converted_call(ag__.ld(self)._postprocess_keypoints_single_class, (ag__.ld(prediction_dict), ag__.ld(channel_indices), ag__.ld(y_indices), ag__.ld(x_indices), ag__.ld(boxes_strided), ag__.ld(num_detections)), None, fscope)
  File "/usr/local/lib/python3.10/dist-packages/tensorflow/python/autograph/impl/api.py", line 441, in converted_call
    result = converted_f(*effective_args)
  File "/tmp/__autograph_generated_file1eqm6c24.py", line 82, in tf___postprocess_keypoints_single_class
    ag__.for_stmt(ag__.converted_call(ag__.ld(range), (ag__.ld(batch_size),), None, fscope), None, loop_body, get_state_1, set_state_1, (), {'iterate_names': 'ex_ind'})
  File "/usr/local/lib/python3.10/dist-packages/tensorflow/python/autograph/operators/control_flow.py", line 418, in for_stmt
    _tf_range_for_stmt(iter_, extra_test, body, get_state, set_state,
  File "/usr/local/lib/python3.10/dist-packages/tensorflow/python/autograph/operators/control_flow.py", line 654, in _tf_range_for_stmt
    _tf_while_stmt(
  File "/usr/local/lib/python3.10/dist-packages/tensorflow/python/autograph/operators/control_flow.py", line 1277, in _tf_while_stmt
    final_loop_vars = control_flow_ops.while_loop(
  File "/usr/local/lib/python3.10/dist-packages/tensorflow/python/ops/control_flow_ops.py", line 2705, in while_loop
    return while_v2.while_loop(
  File "/usr/local/lib/python3.10/dist-packages/tensorflow/python/ops/while_v2.py", line 213, in while_loop
    body_graph = func_graph_module.func_graph_from_py_func(
  File "/usr/local/lib/python3.10/dist-packages/tensorflow/python/framework/func_graph.py", line 1141, in func_graph_from_py_func
    func_outputs = python_func(*func_args, **func_kwargs)
  File "/usr/local/lib/python3.10/dist-packages/tensorflow/python/ops/while_v2.py", line 198, in wrapped_body
    outputs = body(
  File "/usr/local/lib/python3.10/dist-packages/tensorflow/python/autograph/operators/control_flow.py", line 1248, in aug_body
    body()
  File "/usr/local/lib/python3.10/dist-packages/tensorflow/python/autograph/operators/control_flow.py", line 630, in aug_body
    body(iterate)
  File "/tmp/__autograph_generated_file1eqm6c24.py", line 74, in loop_body
    (kpt_coords_for_class, kpt_scores_for_class, kpt_depths_for_class) = ag__.converted_call(ag__.ld(self)._postprocess_keypoints_for_class_and_image, (ag__.ld(keypoint_heatmap), ag__.ld(keypoint_offsets), ag__.ld(keypoint_regression), ag__.ld(classes), ag__.ld(y_indices), ag__.ld(x_indices), ag__.ld(boxes), ag__.ld(ex_ind), ag__.ld(kp_params)), dict(keypoint_depth_predictions=ag__.ld(keypoint_depth_predictions)), fscope)
  File "/usr/local/lib/python3.10/dist-packages/tensorflow/python/autograph/impl/api.py", line 439, in converted_call
    result = converted_f(*effective_args, **kwargs)
  File "/tmp/__autograph_generated_filemz16s7ql.py", line 88, in tf___postprocess_keypoints_for_class_and_image
    (keypoint_candidates, keypoint_scores, num_keypoint_candidates, keypoint_depth_candidates) = ag__.converted_call(ag__.ld(prediction_tensors_to_keypoint_candidates), (ag__.ld(keypoint_heatmap), ag__.ld(keypoint_offsets)), dict(keypoint_score_threshold=ag__.ld(kp_params).keypoint_candidate_score_threshold, max_pool_kernel_size=ag__.ld(kp_params).peak_max_pool_kernel_size, max_candidates=ag__.ld(kp_params).num_candidates_per_keypoint, keypoint_depths=ag__.ld(keypoint_depths)), fscope)
  File "/usr/local/lib/python3.10/dist-packages/tensorflow/python/autograph/impl/api.py", line 439, in converted_call
    result = converted_f(*effective_args, **kwargs)
  File "/tmp/__autograph_generated_file4b5h_wl9.py", line 47, in tf__prediction_tensors_to_keypoint_candidates
    (keypoint_scores, y_indices, x_indices, channel_indices) = ag__.converted_call(ag__.ld(top_k_feature_map_locations), (ag__.ld(keypoint_heatmap_predictions),), dict(max_pool_kernel_size=ag__.ld(max_pool_kernel_size), k=ag__.ld(max_candidates), per_channel=True), fscope)
  File "/usr/local/lib/python3.10/dist-packages/tensorflow/python/autograph/impl/api.py", line 439, in converted_call
    result = converted_f(*effective_args, **kwargs)
  File "/tmp/__autograph_generated_fileawxrt7nt.py", line 150, in tf__top_k_feature_map_locations
    ag__.if_stmt(ag__.ld(per_channel), if_body_3, else_body_3, get_state_3, set_state_3, ('peak_flat_indices', 'scores'), 2)
  File "/usr/local/lib/python3.10/dist-packages/tensorflow/python/autograph/operators/control_flow.py", line 1341, in if_stmt
    _py_if_stmt(cond, body, orelse)
  File "/usr/local/lib/python3.10/dist-packages/tensorflow/python/autograph/operators/control_flow.py", line 1394, in _py_if_stmt
    return body() if cond else orelse()
  File "/tmp/__autograph_generated_fileawxrt7nt.py", line 111, in if_body_3
    ag__.if_stmt(ag__.ld(k) == 1, if_body_1, else_body_1, get_state_1, set_state_1, ('peak_flat_indices', 'scores'), 2)
  File "/usr/local/lib/python3.10/dist-packages/tensorflow/python/autograph/operators/control_flow.py", line 1341, in if_stmt
    _py_if_stmt(cond, body, orelse)
  File "/usr/local/lib/python3.10/dist-packages/tensorflow/python/autograph/operators/control_flow.py", line 1394, in _py_if_stmt
    return body() if cond else orelse()
  File "/tmp/__autograph_generated_fileawxrt7nt.py", line 104, in else_body_1
    scores = ag__.converted_call(ag__.ld(tf).ensure_shape, (ag__.ld(scores), (ag__.ld(batch_size), ag__.ld(num_channels), ag__.ld(k))), None, fscope)
  File "/usr/local/lib/python3.10/dist-packages/tensorflow/python/autograph/impl/api.py", line 377, in converted_call
    return _call_unconverted(f, args, kwargs, options)
  File "/usr/local/lib/python3.10/dist-packages/tensorflow/python/autograph/impl/api.py", line 459, in _call_unconverted
    return f(*args)
  File "/usr/local/lib/python3.10/dist-packages/tensorflow/python/util/traceback_utils.py", line 153, in error_handler
    raise e.with_traceback(filtered_tb) from None
  File "<string>", line 3, in raise_from
TypeError: in user code:

    File "/usr/local/lib/python3.10/dist-packages/object_detection/exporter_lib_v2.py", line 189, in __call__  *
        true_shapes)
    File "/usr/local/lib/python3.10/dist-packages/object_detection/exporter_lib_v2.py", line 126, in _run_inference_on_images  *
        detections = self._model.postprocess(prediction_dict, true_shapes)
    File "/usr/local/lib/python3.10/dist-packages/object_detection/meta_architectures/center_net_meta_arch.py", line 4154, in postprocess  *
        (keypoints, keypoint_scores,
    File "/usr/local/lib/python3.10/dist-packages/object_detection/meta_architectures/center_net_meta_arch.py", line 4590, in _postprocess_keypoints_single_class  *
        (kpt_coords_for_class, kpt_scores_for_class, kpt_depths_for_class) = (
    File "/usr/local/lib/python3.10/dist-packages/object_detection/meta_architectures/center_net_meta_arch.py", line 4726, in _postprocess_keypoints_for_class_and_image  *
        (keypoint_candidates, keypoint_scores, num_keypoint_candidates,
    File "/usr/local/lib/python3.10/dist-packages/object_detection/meta_architectures/center_net_meta_arch.py", line 513, in prediction_tensors_to_keypoint_candidates  *
        keypoint_scores, y_indices, x_indices, channel_indices = (
    File "/usr/local/lib/python3.10/dist-packages/object_detection/meta_architectures/center_net_meta_arch.py", line 342, in top_k_feature_map_locations  *
        scores = tf.ensure_shape(scores, (batch_size, num_channels, k))
    File "<string>", line 3, in raise_from
        

    TypeError: Dimension value must be integer or None or have an __index__ method, got value '<tf.Tensor 'while/strided_slice_9:0' shape=() dtype=int32>' with type '<class 'tensorflow.python.framework.ops.Tensor'>'

However, I'm not sure whether those label map files are the right ones. Any clues?

In the end I was not able to get past the errors from exporter_main_v2.py, but since my final goal was to convert the model to ONNX, the following works for that:

pip install -U tf2onnx

python -m tf2onnx.convert \
--saved-model /workspace/centernet_mobilenetv2_fpn_od/saved_model/ \
--output centernet_mobilenetv2_fpn_od.onnx
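
To sanity-check the converted model, a minimal onnxruntime script along these lines can be used; it reads the input name and dtype from the model instead of assuming them, and the 512x512 zero image is just a placeholder:

import numpy as np
import onnxruntime as ort  # pip install onnxruntime

sess = ort.InferenceSession("centernet_mobilenetv2_fpn_od.onnx")

# Read the input name, shape and dtype from the model rather than hardcoding them.
inp = sess.get_inputs()[0]
print(inp.name, inp.shape, inp.type)

# Dummy 512x512 image batch; dtype follows whatever the signature reports.
dtype = np.uint8 if "uint8" in inp.type else np.float32
dummy = np.zeros((1, 512, 512, 3), dtype=dtype)

outputs = sess.run(None, {inp.name: dummy})
for meta, out in zip(sess.get_outputs(), outputs):
    print(meta.name, out.shape)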
