
How to set the signature name to serving_default to avoid errors after deployment to GCP AI Platform?

I trained a DNN model with TensorFlow on AI Platform. Then I copied the model locally to double-check whether predictions can be obtained from the same model.

gcloud ai-platform local predict --model-dir=/home/jupyter/end-to-end-ml/examples/e2e-ml-model-ex02/app/appbabyweight_trained/export/exporter/1615197796 --json-instances=inputs.json

Predictions are obtained, but with a warning:

If the signature defined in the model is not `serving_default` then you must specify it via --signature-name flag, otherwise the command may fail.

(This warning can be avoided by specifying the signature name explicitly: --signature-name predict.)
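For reference, each line of the --json-instances file passed to gcloud is one JSON object whose keys must match the model's input names exactly. A minimal sketch of building such a line (the values are illustrative, not from the original inputs.json):

```python
import json

# One prediction instance per line of inputs.json; the keys must match
# the SignatureDef inputs (is_male, mother_age, plurality,
# gestation_weeks). Values below are made up for illustration.
instance = {
    "is_male": "True",
    "mother_age": 26.0,
    "plurality": "Single(1)",
    "gestation_weeks": 39.0,
}
line = json.dumps(instance)
print(line)
```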

After deploying the model to AI Platform, the warning becomes an error: the serving signature name must be serving_default, as can be seen in the error message below:

{ "error": "Serving signature name: \"serving_default\" not found in signature def" }

After checking the saved model with this command:

saved_model_cli show --dir /home/jupyter/end-to-end-ml/examples/e2e-ml-model-ex02/app/appbabyweight_trained2/output-dir/export/exporter/1615439076 --all
MetaGraphDef with tag-set: 'serve' contains the following SignatureDefs:

signature_def['predict']:
  The given SavedModel SignatureDef contains the following input(s):
    inputs['gestation_weeks'] tensor_info:
        dtype: DT_FLOAT
        shape: (-1)
        name: Placeholder_3:0
    inputs['is_male'] tensor_info:
        dtype: DT_STRING
        shape: (-1)
        name: Placeholder:0
    inputs['mother_age'] tensor_info:
        dtype: DT_FLOAT
        shape: (-1)
        name: Placeholder_1:0
    inputs['plurality'] tensor_info:
        dtype: DT_STRING
        shape: (-1)
        name: Placeholder_2:0
  The given SavedModel SignatureDef contains the following output(s):
    outputs['predictions'] tensor_info:
        dtype: DT_FLOAT
        shape: (-1, 1)
        name: dnn/logits/BiasAdd:0
  Method name is: tensorflow/serving/predict

So, the signature name of my saved model is predict.
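Conceptually, the serving stack looks the requested name up in the MetaGraph's SignatureDef map, which behaves like a dict keyed by signature name. A toy sketch (not the actual serving code) of why the deployed model fails:

```python
# Toy model of the SignatureDef map produced by the export above:
# the only key present is 'predict'.
signature_defs = {"predict": "<SignatureDef for the DNN>"}

# AI Platform requests 'serving_default' by default.
requested = "serving_default"
found = requested in signature_defs
print(found)  # serving_default is missing, hence the deployment error
```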

The question is: how to change the signature name?

PS: Below is how I defined the DNN:

# Define feature columns
def get_categorical(name, values):
    return tf.feature_column.indicator_column(
        tf.feature_column.categorical_column_with_vocabulary_list(name, values))

def get_cols():
    # Define column types
    return [
            get_categorical('is_male', ['True', 'False', 'Unknown']),
            tf.feature_column.numeric_column('mother_age'),
            get_categorical('plurality',
                        ['Single(1)', 'Twins(2)', 'Triplets(3)',
                         'Quadruplets(4)', 'Quintuplets(5)','Multiple(2+)']),
            tf.feature_column.numeric_column('gestation_weeks')
        ]

# Create serving input function to be able to serve predictions later using provided inputs
def serving_input_fn():
    feature_placeholders = {
        'is_male': tf.compat.v1.placeholder(tf.string, [None]),
        'mother_age': tf.compat.v1.placeholder(tf.float32, [None]),
        'plurality': tf.compat.v1.placeholder(tf.string, [None]),
        'gestation_weeks': tf.compat.v1.placeholder(tf.float32, [None])
    }
    features = {
        key: tf.expand_dims(tensor, -1) for key, tensor in feature_placeholders.items()
    }
    return tf.estimator.export.ServingInputReceiver(features, feature_placeholders)

# Create estimator to train and evaluate
def train_and_evaluate(args):
    
    EVAL_INTERVAL = 30
    run_config = tf.estimator.RunConfig(save_checkpoints_secs = EVAL_INTERVAL, keep_checkpoint_max = 3)

    estimator = tf.estimator.DNNRegressor(
                        model_dir = args['output_dir'],
                        feature_columns = get_cols(),
                        hidden_units = args['nnsize'],
                        config = run_config)
    train_spec = tf.estimator.TrainSpec(
                        input_fn = read_dataset(args['train_data_path'],
                                    mode = tf.estimator.ModeKeys.TRAIN,
                                    batch_size = args['batch_size']),
                        max_steps = TRAIN_STEPS)  # TRAIN_STEPS and read_dataset are defined elsewhere
    exporter = tf.estimator.LatestExporter('exporter', serving_input_fn)
    eval_spec = tf.estimator.EvalSpec(
                        input_fn = read_dataset(args['eval_data_path'],
                                    mode = tf.estimator.ModeKeys.EVAL,
                                    batch_size = args['batch_size']),
                        steps = args['eval_steps'],
                        start_delay_secs = 60,  # start evaluating after N seconds
                        throttle_secs = EVAL_INTERVAL,  # evaluate every N seconds
                        exporters = exporter)
    tf.estimator.train_and_evaluate(estimator, train_spec, eval_spec)

Thank you

To add a serving_default signature to an existing SavedModel:

import tensorflow as tf
m = tf.saved_model.load("tf2-preview_inception_v3_classification_4")
print(m.signatures) # _SignatureMap({}) - Empty
t_spec = tf.TensorSpec([None,None,None,3], tf.float32)
c_func = m.__call__.get_concrete_function(inputs=t_spec)
signatures = {'serving_default': c_func}
tf.saved_model.save(m, 'tf2-preview_inception_v3_classification_5', signatures=signatures)

# Test new model
m5 = tf.saved_model.load("tf2-preview_inception_v3_classification_5")
print(m5.signatures) # _SignatureMap({'serving_default': <ConcreteFunction signature_wrapper(*, inputs) at 0x17316DC50>})
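The same reload-and-re-save approach applies to the Estimator export from the question: load the SavedModel, take its existing predict signature, and save again under the key serving_default. A self-contained sketch under stated assumptions (Doubler is a hypothetical stand-in model, not the author's DNN, and the paths are temporary directories rather than the original export path):

```python
import tempfile

import tensorflow as tf

# Hypothetical stand-in for a model exported with a non-default
# signature key 'predict', mirroring the situation in the question.
class Doubler(tf.Module):
    @tf.function(input_signature=[tf.TensorSpec([None], tf.float32)])
    def __call__(self, x):
        return {"predictions": 2.0 * x}

orig_dir = tempfile.mkdtemp()
fixed_dir = tempfile.mkdtemp()

m = Doubler()
tf.saved_model.save(
    m, orig_dir,
    signatures={"predict": m.__call__.get_concrete_function()})

# Reload and re-save the same concrete function under the key
# that AI Platform expects.
reloaded = tf.saved_model.load(orig_dir)
tf.saved_model.save(
    reloaded, fixed_dir,
    signatures={"serving_default": reloaded.signatures["predict"]})

fixed = tf.saved_model.load(fixed_dir)
print(list(fixed.signatures.keys()))  # should now contain 'serving_default'
```

The re-saved model in fixed_dir can then be deployed to AI Platform without the --signature-name workaround.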
