
Continue training on SavedModel or load checkpoint from SavedModel

In TensorFlow 1.14, it is clear that tf.compat.v1.train.init_from_checkpoint can load a ckpt to continue training (or to warm-start). However, I couldn't find any corresponding approach for SavedModel, and tf.estimator.WarmStartSettings also only supports ckpt. This seems strange to me, because this answer mentions that there should be a checkpoint stored inside the SavedModel (its layout is sketched after the questions below). Does anyone know:

  1. How to load the checkpoint inside a SavedModel? or
  2. How to warm-start training from a SavedModel?
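
For reference, a TF 1.x SavedModel stores its weights as an ordinary checkpoint under the variables/ subdirectory of the export (variables.index plus variables.data-* shards), so that prefix can be handed to the checkpoint-based APIs. A minimal sketch, assuming an export directory saved_model_dir and a current graph whose variable names match the exported ones:

import os
import tensorflow as tf

saved_model_dir = "/path/to/saved_model"                      # assumed export directory
ckpt_prefix = os.path.join(saved_model_dir, "variables", "variables")

# Option 1: build your model graph first, then copy every matching variable
# from the checkpoint (root-scope to root-scope assignment map)
tf.compat.v1.train.init_from_checkpoint(ckpt_prefix, {"/": "/"})

# Option 2: warm-start an Estimator from the same checkpoint prefix
ws = tf.estimator.WarmStartSettings(ckpt_to_initialize_from=ckpt_prefix)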

In order to load a SavedModel to continue training, you can use tf.saved_model.loader.load as follows:

import tensorflow as tf

graph = tf.Graph()
with tf.Session(graph=graph) as sess:
  # load() restores the graph and its variable values; it returns the MetaGraphDef used below
  meta_graph_def = tf.saved_model.loader.load(sess, [tf.saved_model.tag_constants.SERVING], saved_model_location)
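
If the exported graph does not already contain a training op, one common way to continue training is to attach a new loss and optimizer to the restored tensors and initialize only the newly created variables. A rough sketch, where the tensor name "logits:0", the label placeholder, and the optimizer choice are all assumptions:

# continue inside the session above; new ops must be added to the same graph
with graph.as_default():
  logits = graph.get_tensor_by_name("logits:0")   # assumed name of a restored output tensor
  labels = tf.placeholder(tf.int64, shape=[None], name="new_labels")

  loss = tf.reduce_mean(
      tf.nn.sparse_softmax_cross_entropy_with_logits(labels=labels, logits=logits))
  optimizer = tf.train.AdamOptimizer(1e-4)
  # assumes the exported meta graph kept its trainable-variables collection
  train_op = optimizer.minimize(loss, var_list=tf.trainable_variables())

  # the restored weights are already initialized by load(); only the
  # optimizer's newly created slot variables still need initializing
  sess.run(tf.variables_initializer(optimizer.variables()))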

In order to feed new input data, you can get the input tensor names from the signature as follows:

signature_def = meta_graph_def.signature_def[tf.saved_model.signature_constants.DEFAULT_SERVING_SIGNATURE_DEF_KEY]
inputs = [v.name for v in signature_def.inputs.values()]          # full tensor names, e.g. "x:0"
input_tensors = [node.split(":")[0] for node in inputs]           # node (op) names without the ":0" suffix

You can then build a feed_dict that maps those input tensors to your new data. The output tensor names can be obtained in the same way from signature_def.outputs.
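
As a minimal sketch of that last step, continuing from the variables above inside the same session (and assuming the signature has a single input and that new_batch is a NumPy array you provide):

# resolve the output tensors from the signature in the same way as the inputs
outputs = [v.name for v in signature_def.outputs.values()]
output_tensors = [graph.get_tensor_by_name(name) for name in outputs]

# feed a new batch to the (single) input tensor and run the restored graph
feed_dict = {graph.get_tensor_by_name(inputs[0]): new_batch}
results = sess.run(output_tensors, feed_dict=feed_dict)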
