
How to correctly create saved_model.pb?

I have been trying to create a saved_model.pb file (from .ckpt and .meta files), which is needed in order to do inference. I can successfully create an export directory that contains saved_model.pb and variables; however, when I deploy my script, I get a KeyError on the expected tensors:

y_probs = [my_predictor._fetch_tensors['y_prob_{}'.format(p)] for p in protocols]

KeyError: 'y_prob_protocol1'

The problem is probably in how I've defined my inputs/outputs (see the code at the end), because the feed and fetch tensors are empty, as you can see below:

my_predictor = predictor.from_saved_model('export')

SavedModelPredictor with feed tensors {} and fetch_tensors {}
Here is the export script:

saver = tf.train.import_meta_graph(opts.model)
builder = tf.saved_model.builder.SavedModelBuilder(opts.out_path)

with tf.Session() as sess:
    # Restore variables from disk.
    saver.restore(sess, opts.checkpoint)
    print("Model restored.")

    input_tensor = tf.placeholder(tf.float32, shape=(None,128,128,128,1), name='tensors/component_0')
    tensor_1 = tf.placeholder(tf.float32, shape=(None,128,128,128,2), name='pred/Reshape')
    tensor_2 = tf.placeholder(tf.float32, shape=(None,128,128,128,3), name='pred1/Reshape')


    tensor_info_input = tf.saved_model.utils.build_tensor_info(input_tensor)
    tensor_info_1 = tf.saved_model.utils.build_tensor_info(tensor_1)
    tensor_info_2 = tf.saved_model.utils.build_tensor_info(tensor_2)


    prediction_signature = (
        tf.saved_model.signature_def_utils.build_signature_def(
            inputs={'x': tensor_info_input},
            outputs={'y_prob_protocol1': tensor_info_1, 'y_prob_protocol2':tensor_info_2},
            method_name=tf.saved_model.signature_constants.PREDICT_METHOD_NAME))

    builder.add_meta_graph_and_variables(
        sess, [tf.saved_model.tag_constants.SERVING],
        signature_def_map={
            'predict_images':
                prediction_signature,
        })

    builder.save()   

Thank you for your help!

I suspect there might be two reasons for this error:

  1. The restored model (saved using checkpoints) might not be properly linked to the saved_model.pb file written by builder.save().

  2. You have used two outputs, tensor_info_1 and tensor_info_2, in the SignatureDef, but the tensors behind them are not actually defined as model operations (at least in the code shown); they are freshly created placeholders rather than nodes of the restored graph. By a definition, I mean something like

    y = tf.nn.softmax(tf.matmul(x, w) + b, name='y')

     (see the sketch right after this list for how to fetch the real tensors from the restored graph).
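
For reason 2, the fix is to look up the tensors that already exist in the restored graph rather than creating fresh placeholders. A minimal sketch, assuming the names used in your script ('tensors/component_0', 'pred/Reshape', 'pred1/Reshape') really are node names in the restored graph:

import tensorflow as tf

meta_path = 'model.meta'        # your .meta file
checkpoint_path = 'model.ckpt'  # your checkpoint prefix

saver = tf.train.import_meta_graph(meta_path)

with tf.Session() as sess:
    saver.restore(sess, checkpoint_path)
    graph = tf.get_default_graph()

    # Look up the tensors that already exist in the restored graph
    # instead of creating new, unconnected placeholders.
    input_tensor = graph.get_tensor_by_name('tensors/component_0:0')
    tensor_1 = graph.get_tensor_by_name('pred/Reshape:0')
    tensor_2 = graph.get_tensor_by_name('pred1/Reshape:0')

    # Build the tensor_info objects and the SignatureDef from these tensors
    # exactly as in the question, then add_meta_graph_and_variables + save.

If you are unsure of the exact names, printing [n.name for n in graph.as_graph_def().node] after restoring shows what the graph actually contains.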

You can use this simple script to convert checkpoint and .meta files to a .pb file, but you must specify the names of the output nodes.

import tensorflow as tf

meta_path = 'model.ckpt-22480.meta'  # Your .meta file
output_node_names = ['output']       # Output node names (not tensor names such as 'output:0')

with tf.Session() as sess:

    # Restore the graph structure from the .meta file
    saver = tf.train.import_meta_graph(meta_path)

    # Load the weights from the latest checkpoint in the current directory
    saver.restore(sess, tf.train.latest_checkpoint('.'))

    # Freeze the graph: replace variables with constants
    frozen_graph_def = tf.graph_util.convert_variables_to_constants(
        sess,
        sess.graph_def,
        output_node_names)

    # Save the frozen graph
    with open('output_graph.pb', 'wb') as f:
        f.write(frozen_graph_def.SerializeToString())
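
For completeness, a frozen graph written this way can be loaded back for inference roughly as follows. This is only a sketch; 'input:0' and 'output:0' are hypothetical tensor names, so substitute the real names from your graph:

import tensorflow as tf

# Read the serialized GraphDef from disk
with tf.gfile.GFile('output_graph.pb', 'rb') as f:
    graph_def = tf.GraphDef()
    graph_def.ParseFromString(f.read())

# Import it into a fresh graph
with tf.Graph().as_default() as graph:
    tf.import_graph_def(graph_def, name='')

with tf.Session(graph=graph) as sess:
    x = graph.get_tensor_by_name('input:0')   # hypothetical input tensor name
    y = graph.get_tensor_by_name('output:0')  # hypothetical output tensor name
    # predictions = sess.run(y, feed_dict={x: batch})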

This conversion is a lot of extra work. Instead of saving the model to checkpoints and then trying to convert it to a .pb file, you can save the model, graph, and SignatureDefs directly to a .pb file, either with SavedModelBuilder or with export_saved_model.
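
For example, in TF 1.x, tf.saved_model.simple_save wraps SavedModelBuilder and writes a servable SavedModel in one call. A minimal sketch, assuming x is your input placeholder and y_prob stands in for the real prediction tensor in the live training graph:

import tensorflow as tf

x = tf.placeholder(tf.float32, shape=(None, 128, 128, 128, 1), name='x')
# ... build the model; y_prob is a stand-in for the real prediction op ...
y_prob = tf.identity(x, name='y_prob')

with tf.Session() as sess:
    sess.run(tf.global_variables_initializer())
    # ... train ...
    tf.saved_model.simple_save(
        sess,
        'export',   # export directory; must not already contain a model
        inputs={'x': x},
        outputs={'y_prob_protocol1': y_prob})

The signature written this way is stored under the default serving key, which is what predictor.from_saved_model looks up when no signature key is passed.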

Example code for saving a model with SavedModelBuilder is given in the link below. It is the official example provided by the TensorFlow Serving team, and following its flow and structure is recommended.

https://github.com/tensorflow/serving/blob/master/tensorflow_serving/example/mnist_saved_model.py
