
@tensorflow/tfjs-node: Error: Failed to load SavedModel: Op type not registered 'NonMaxSuppressionV5' in binary running

The issue

I developed a simple Node.js app for object detection using @tensorflow/tfjs-node. Everything works fine on my development PC (Windows 10 Pro), but when I try to execute it on my Raspberry Pi 2B (Raspbian 10), I get the following error:

Overriding the gradient for 'Max'
Overriding the gradient for 'OneHot'
Overriding the gradient for 'PadV2'
Overriding the gradient for 'SpaceToBatchND'
Overriding the gradient for 'SplitV'
2020-07-31 11:25:12.068892: I tensorflow/cc/saved_model/reader.cc:31] Reading SavedModel from: ./assets/saved_model
2020-07-31 11:25:12.643852: I tensorflow/cc/saved_model/reader.cc:54] Reading meta graph with tags { serve }
2020-07-31 11:25:13.206821: I tensorflow/cc/saved_model/loader.cc:311] SavedModel load for tags { serve }; Status: fail. Took 1137915 microseconds.
Error: Failed to load SavedModel: Op type not registered 'NonMaxSuppressionV5' in binary running on raspberrypi. Make sure the Op and Kernel are registered in the binary running in this process. Note that if you are loading a saved graph which used ops from tf.contrib, accessing (e.g.) `tf.contrib.resampler` should be done before importing the graph, as contrib ops are lazily registered when the module is first accessed.
    at NodeJSKernelBackend.loadSavedModelMetaGraph (/home/pi/storage/tensorflow-test-node/node_modules/@tensorflow/tfjs-node/dist/nodejs_kernel_backend.js:1588:29)
    at Object.<anonymous> (/home/pi/storage/tensorflow-test-node/node_modules/@tensorflow/tfjs-node/dist/saved_model.js:429:45)
    at step (/home/pi/storage/tensorflow-test-node/node_modules/@tensorflow/tfjs-node/dist/saved_model.js:48:23)
    at Object.next (/home/pi/storage/tensorflow-test-node/node_modules/@tensorflow/tfjs-node/dist/saved_model.js:29:53)
    at fulfilled (/home/pi/storage/tensorflow-test-node/node_modules/@tensorflow/tfjs-node/dist/saved_model.js:20:58)

I can reproduce it with the following lines:

const tf = require('@tensorflow/tfjs-node');

(async () => {
  // Native SavedModel: ./assets/saved_model/saved_model.pb
  const objectDetectionModel = await tf.node.loadSavedModel('./assets/saved_model'); // Error

  // ...
})();

I suppose the error is related to the SavedModel version, but I don't know how to convert it for use on the Raspberry Pi, or why the Node.js app needs a different SavedModel depending on whether it runs on Windows or Raspbian.

Details

Environment

  • Development :
    • OS: Windows 10 Pro
    • NodeJS: v12.16.2
    • NPM: 6.11.3
  • Target (Raspberry Pi):
    • OS: Raspbian 10
    • NodeJS: v12.18.3
    • NPM: 6.14.6

NodeJS app

@tensorflow/tfjs-node@2.0.1 is the only dependency declared in package.json.

Training

The model was trained in Python following this guide (the TensorFlow version used was 1.15.2).

SavedModel

Details of the SavedModel (output of saved_model_cli show --dir saved_model --tag_set serve --signature_def serving_default):

The given SavedModel SignatureDef contains the following input(s):
  inputs['inputs'] tensor_info:
      dtype: DT_INT32
      shape: (-1, -1, -1, 3)
      name: image_tensor:0
The given SavedModel SignatureDef contains the following output(s):
  outputs['detection_boxes'] tensor_info:
      dtype: DT_FLOAT
      shape: (-1, 300, 4)
      name: detection_boxes:0
  outputs['detection_classes'] tensor_info:
      dtype: DT_FLOAT
      shape: (-1, 300)
      name: detection_classes:0
  outputs['detection_multiclass_scores'] tensor_info:
      dtype: DT_FLOAT
      shape: (-1, 300, 37)
      name: detection_multiclass_scores:0
  outputs['detection_scores'] tensor_info:
      dtype: DT_FLOAT
      shape: (-1, 300)
      name: detection_scores:0
  outputs['num_detections'] tensor_info:
      dtype: DT_FLOAT
      shape: (-1)
      name: num_detections:0
  outputs['raw_detection_boxes'] tensor_info:
      dtype: DT_FLOAT
      shape: (-1, 300, 4)
      name: raw_detection_boxes:0
  outputs['raw_detection_scores'] tensor_info:
      dtype: DT_FLOAT
      shape: (-1, 300, 37)
      name: raw_detection_scores:0
Method name is: tensorflow/serving/predict

You need to convert your model for TensorFlow Lite (with a reduced op set). The error occurs because the binary running on the Raspberry Pi lacks ops that were available in the desktop build the model was exported against (here, NonMaxSuppressionV5). Read more about ops here: https://www.tensorflow.org/lite/guide/ops_select
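As a minimal sketch of that conversion, the TF Lite converter can be run over the SavedModel from the question with the select-TF-ops fallback enabled, so ops without a TFLite builtin kernel (such as NonMaxSuppressionV5) are carried over as TF ops. This assumes TensorFlow 1.15.x (the version used for training) is installed and ./assets/saved_model is the directory from the question; the output filename model.tflite is arbitrary:

import tensorflow as tf

# Load the SavedModel exported during training (path from the question).
converter = tf.lite.TFLiteConverter.from_saved_model('./assets/saved_model')

# Allow TFLite builtins plus a fallback to full TF ops for anything
# (like NonMaxSuppressionV5) that has no builtin TFLite kernel.
converter.target_spec.supported_ops = [
    tf.lite.OpsSet.TFLITE_BUILTINS,
    tf.lite.OpsSet.SELECT_TF_OPS,
]

tflite_model = converter.convert()
with open('model.tflite', 'wb') as f:
    f.write(tflite_model)

Note this produces a .tflite file, which is loaded with the TF Lite runtime rather than tf.node.loadSavedModel.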

There already is a build script that exports the model to TF Lite, similar to the one you're using (same folder in the official examples repo). The functionality is the same, but the input format is slightly different. Check it out: https://github.com/tensorflow/models/tree/master/research/object_detection/export_tflite_ssd_graph.py
