I would like to change the input and output signatures of the saved model. I used tf.Module objects to build the operations of the main model:
class Generator(tf.Module):
    def __init__(self, ..., name=None):
        super(Generator, self).__init__(name=name)
        ...
        with self.name_scope:
            ...

    @tf.Module.with_name_scope
    def __call__(self, input):
        ...

    @tf.function
    def serve_function(self, input):
        out = self.__call__(input)
        return out
call = model.Generator.serve_function.get_concrete_function(tf.TensorSpec([None, 256, 256, 3], tf.float32))
tf.saved_model.save(model.Generator, os.path.join(train_log_dir, 'frozen'))
Then I load the model, but the signature only contains the default names ('serving_default' for the signature key and 'output_0' for the output). How can I change these?
I figured out a way to define the output signature without using tf.Module: define a tf.function that returns a dictionary of outputs, where the keys of the dictionary become the output names.
# Create the model
model = ...

# Train the model
model.fit(...)

# Define where to save the model
export_path = "..."

@tf.function()
def my_predict(my_prediction_inputs):
    inputs = {
        'my_serving_input': my_prediction_inputs,
    }
    prediction = model(inputs)
    return {"my_prediction_outputs": prediction}

my_signatures = my_predict.get_concrete_function(
    my_prediction_inputs=tf.TensorSpec([None, None], dtype=tf.dtypes.float32, name="my_prediction_inputs")
)

# Save the model.
tf.saved_model.save(
    model,
    export_dir=export_path,
    signatures=my_signatures
)
This produces the following signature:
signature_def['serving_default']:
  The given SavedModel SignatureDef contains the following input(s):
    inputs['my_prediction_inputs'] tensor_info:
        dtype: DT_FLOAT
        shape: (-1, -1)
        name: serving_default_my_prediction_inputs:0
  The given SavedModel SignatureDef contains the following output(s):
    outputs['my_prediction_outputs'] tensor_info:
        dtype: DT_FLOAT
        shape: (-1, 1)
        name: StatefulPartitionedCall:0
  Method name is: tensorflow/serving/predict
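The same idea can be applied to the tf.Module Generator from the question: return a dictionary from the serving function, name the input TensorSpec, and pass the concrete function through the signatures argument of tf.saved_model.save. Here is a minimal sketch; the generator body is a placeholder, and the names 'generator_input' and 'generated_image' are illustrative, not from the original code:

import os
import tensorflow as tf

class Generator(tf.Module):
    def __init__(self, name=None):
        super(Generator, self).__init__(name=name)
        # ... build variables/layers here ...

    @tf.Module.with_name_scope
    def __call__(self, x):
        return x  # placeholder for the real forward pass

    @tf.function
    def serve_function(self, generator_input):
        # Returning a dict makes 'generated_image' the output name
        # instead of the default 'output_0'.
        return {'generated_image': self.__call__(generator_input)}

generator = Generator(name='generator')
call = generator.serve_function.get_concrete_function(
    tf.TensorSpec([None, 256, 256, 3], tf.float32, name='generator_input')
)
tf.saved_model.save(
    generator,
    os.path.join(train_log_dir, 'frozen'),  # train_log_dir as in the question
    signatures={'serving_default': call}
)

Passing signatures={'serving_default': call} is the key step the original save call was missing: without it, TensorFlow generates the default signature with generic input/output names.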
Another way of creating the serving_default signature is:
import tensorflow as tf
import tensorflow_hub as hub
import tensorflow_text

export_dir = "./models/use/00000001"
module = hub.load("https://tfhub.dev/google/universal-sentence-encoder-multilingual-large/3")

@tf.function
def my_module_encoder(text):
    # The input name 'text' comes from the function argument;
    # the dict key 'embeddings' becomes the output name.
    outputs = {
        'embeddings': module(text)
    }
    return outputs

tf.saved_model.save(
    module,
    export_dir,
    signatures=my_module_encoder.get_concrete_function(
        text=tf.TensorSpec(shape=None, dtype=tf.string)
    ),
    options=None
)
You can inspect the created SignatureDefs with the saved_model_cli command, as shown below:
$ saved_model_cli show --all --dir models/use/00000001

MetaGraphDef with tag-set: 'serve' contains the following SignatureDefs:

signature_def['__saved_model_init_op']:
  The given SavedModel SignatureDef contains the following input(s):
  The given SavedModel SignatureDef contains the following output(s):
    outputs['__saved_model_init_op'] tensor_info:
        dtype: DT_INVALID
        shape: unknown_rank
        name: NoOp
  Method name is:

signature_def['serving_default']:
  The given SavedModel SignatureDef contains the following input(s):
    inputs['text'] tensor_info:
        dtype: DT_STRING
        shape: unknown_rank
        name: serving_default_text:0
  The given SavedModel SignatureDef contains the following output(s):
    outputs['embeddings'] tensor_info:
        dtype: DT_FLOAT
        shape: (-1, 512)
        name: StatefulPartitionedCall:0
  Method name is: tensorflow/serving/predict
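Once saved, the names you chose are what you interact with when loading the model back. A short usage sketch, assuming the USE export from above:

import tensorflow as tf
import tensorflow_text  # registers the custom ops this model needs

loaded = tf.saved_model.load("./models/use/00000001")
infer = loaded.signatures['serving_default']

# Inputs are passed as keyword arguments matching the signature's
# input names; outputs come back as a dict keyed by the names we chose.
result = infer(text=tf.constant(["hello world"]))
embeddings = result['embeddings']  # shape (1, 512)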