Inference using saved model in Tensorflow 2: how to control in/output?

Adapting my code from TF1 to TF2.6, I ran into trouble. I am trying to add some custom layers to an Inception ResNet, save the model, and then load and run it.

from tensorflow.keras.layers import Dense, GlobalAveragePooling2D
from tensorflow.keras.models import Model
from tensorflow.keras.applications.inception_resnet_v2 import InceptionResNetV2
import tensorflow as tf                                                                                                                                         
import numpy as np                                                                                                                                              
from PIL import Image                                                                                                                                           
                                                                                                                                                                
export_path = "./save_test"                                                                                                                                     
                                                                                                                                                                
# Get model without top and add two layers                                                                                                                      
base_model = InceptionResNetV2(weights='imagenet', input_tensor=None, include_top=False)                                                                        
out = base_model.output                                                                                                                                         
out = GlobalAveragePooling2D()(out)                                                                                                                             
predictions = Dense(7, activation='softmax', name="output")(out)                                                                                                
                                                                                                                                                                
# Make new model using inputs from base model and custom outputs                                                                                                
model = Model(inputs=base_model.input, outputs=[predictions])                                                                                                   
                                                                                                                                                                
# save model                                                                                                                                                    
tf.saved_model.save(model, export_path)                                                                                                                         
                                                                                                                                                                
# load model and run                                                                                                                                            
with tf.compat.v1.Session(graph=tf.Graph()) as sess:                                                                                                            
    tf.compat.v1.saved_model.loader.load(sess, ['serve'], export_path)                                                                                          
    graph = tf.compat.v1.get_default_graph()                                                                                                                    
                                                                                                                                                                
    img = Image.new('RGB', (299, 299))                                                                                                                          
    x = tf.keras.preprocessing.image.img_to_array(img)                                                                                                          
    x = np.expand_dims(x, axis=0)                                                                                                                               
    x = x[..., :3]                                                                                                                                              
    x /= 255.0                                                                                                                                                  
    x = (x - 0.5) * 2.0                                                                                                                                         
                                                                                                                                                                
    y_pred = sess.run('output/Softmax:0', feed_dict={'serving_default_input_1:0': x})                                                                           

Error: KeyError: "The name 'output/Softmax:0' refers to a Tensor which does not exist. The operation, 'output/Softmax', does not exist in the graph."

What I don't understand: predictions.name is 'output/Softmax:0', but graph.get_tensor_by_name('output/Softmax:0') tells me it does not exist!
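(As a quick way to see what is actually in the reloaded graph, here is a small debugging sketch that reuses export_path from the snippet above and simply lists the operation names. When a Keras model is written with tf.saved_model.save and read back through the TF1 loader, the layer computations live inside function definitions; the graph itself only contains the signature placeholders, a StatefulPartitionedCall op and the variable-restore machinery, which is why 'output/Softmax' is not found even though predictions.name reports it.)

# Debugging sketch: print the operations that exist in the loaded graph.
# Expect placeholders, StatefulPartitionedCall and restore ops rather than
# the original Keras layer names such as 'output/Softmax'.
with tf.compat.v1.Session(graph=tf.Graph()) as sess:
    tf.compat.v1.saved_model.loader.load(sess, ['serve'], export_path)
    graph = tf.compat.v1.get_default_graph()
    for op in graph.get_operations():
        print(op.name)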

Note: I am aware that I can save and load with TF2's tf.keras.models.save_model and tf.keras.models.load_model and then run the model with model(x). However, in my application I have multiple models in memory, and I have found that inference takes much longer than in my TF1 code using the session object. I would therefore like to use the TF1 approach with the session object in compatibility mode.
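(For reference, a minimal sketch of the TF2-native path mentioned above, reusing export_path and the preprocessed x from the code block; whether this gets close to the TF1 session speed for a multi-model setup is something to measure, but wrapping the call in tf.function is the usual way to avoid per-call eager overhead.)

# TF2-native load of the same SavedModel directory.
loaded = tf.keras.models.load_model(export_path)

# Wrap the call in tf.function so repeated inference runs as a traced graph
# instead of op-by-op eager execution.
infer = tf.function(lambda images: loaded(images, training=False))

y_pred = infer(tf.constant(x, dtype=tf.float32)).numpy()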

How can I control the names of the inputs/outputs when saving? What am I missing?

If you haven't already, you could try something like this (running on TF 2.7), as I think you are referencing the wrong keys in the SignatureDef:

from tensorflow.keras.layers import Dense, GlobalAveragePooling2D
from tensorflow.keras.models import Model
from tensorflow.keras.applications.inception_resnet_v2 import InceptionResNetV2
import tensorflow as tf                                                                                                                                         
import numpy as np                                                                                                                                              
from PIL import Image                                                                                                                                           
                                                                                                                                                                
export_path = "./save_test"                                                                                                                                     
                                                                                                                                                                
base_model = InceptionResNetV2(weights='imagenet', input_tensor=None, include_top=False)                                                                        
out = base_model.output                                                                                                                                         
out = GlobalAveragePooling2D()(out)                                                                                                                             
predictions = Dense(7, activation='softmax', name="output")(out)                                                                                                
model = Model(inputs=base_model.input, outputs=[predictions])                                                                                                   
                                                                                                                                                              
tf.saved_model.save(model, export_path)

with tf.compat.v1.Session(graph=tf.Graph()) as sess:                                                                                                            
    meta_graph = tf.compat.v1.saved_model.loader.load(sess, ["serve"], export_path)
    sig_def = meta_graph.signature_def[tf.saved_model.DEFAULT_SERVING_SIGNATURE_DEF_KEY]
    input_key = list(dict(sig_def.inputs).keys())[0]
    input_name = sig_def.inputs[input_key].name
    output_name = sig_def.outputs['output'].name
    img = Image.new('RGB', (299, 299))                                                                                                                          
    x = tf.keras.preprocessing.image.img_to_array(img)                                                                                                          
    x = np.expand_dims(x, axis=0)                                                                                                                               
    x = x[..., :3]                                                                                                                                              
    x /= 255.0                                                                                                                                                  
    x = (x - 0.5) * 2.0   
    y_pred = sess.run(output_name, feed_dict={input_name: x})        
    print(y_pred)  
INFO:tensorflow:Restoring parameters from ./save_test/variables/variables
[[0.14001141 0.13356228 0.14509581 0.22432518 0.16313255 0.11899492
  0.07487784]]
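(A side note on why the snippet reads the input key with list(...)[0] instead of hard-coding it: Keras numbers its auto-generated input layers per Python process, so depending on how many models have been built the key may be 'input_1', 'input_2', and so on. A small sketch, reusing sig_def from the snippet above, to check the keys before running:)

# The input key follows Keras' auto-generated layer name (input_1, input_2, ...),
# so print the keys instead of guessing.
print(list(sig_def.inputs.keys()))    # e.g. ['input_2']
print(list(sig_def.outputs.keys()))   # e.g. ['output']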

You could also take a look at the SignatureDef for input and output information:

print(meta_graph.signature_def)
{'serving_default': inputs {
  key: "input_2"
  value {
    name: "serving_default_input_2:0"
    dtype: DT_FLOAT
    tensor_shape {
      dim {
        size: -1
      }
      dim {
        size: -1
      }
      dim {
        size: -1
      }
      dim {
        size: 3
      }
    }
  }
}
outputs {
  key: "output"
  value {
    name: "StatefulPartitionedCall:0"
    dtype: DT_FLOAT
    tensor_shape {
      dim {
        size: -1
      }
      dim {
        size: 7
      }
    }
  }
}
method_name: "tensorflow/serving/predict"
, '__saved_model_init_op': outputs {
  key: "__saved_model_init_op"
  value {
    name: "NoOp"
    tensor_shape {
      unknown_rank: true
    }
  }
}
}
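(The same signature information is available through the TF2 API, in case you want to inspect a SavedModel without opening a v1 session; a short sketch using tf.saved_model.load:)

# Inspect the serving signature with the TF2 loading API.
loaded = tf.saved_model.load(export_path)
infer = loaded.signatures['serving_default']
print(infer.structured_input_signature)   # input argument structure and TensorSpecs
print(infer.structured_outputs)           # output keys and TensorSpecs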

If you remove the first layer of your base_model and add a new Input layer, you can use static key names like this:

from tensorflow.keras.layers import Dense, GlobalAveragePooling2D
from tensorflow.keras.models import Model
from tensorflow.keras.applications.inception_resnet_v2 import InceptionResNetV2
import tensorflow as tf                                                                                                                                         
import numpy as np                                                                                                                                              
from PIL import Image                                                                                                                                           
                                                                                                                                                                
export_path = "./save_test"                                                                                                                                     
                                                                                                                                                                
base_model = InceptionResNetV2(weights='imagenet', input_tensor=None, include_top=False)
base_model.layers.pop(0)
new_input = tf.keras.layers.Input(shape=(299,299,3), name='input')
out = base_model(new_input)                                                                                                                                        
out = GlobalAveragePooling2D()(out)                                                                                                                             
predictions = Dense(7, activation='softmax', name="output")(out) 

model = Model(inputs=new_input, outputs=[predictions])                                                                                                   
tf.saved_model.save(model, export_path)

with tf.compat.v1.Session(graph=tf.Graph()) as sess:                                                                                                            
    meta_graph = tf.compat.v1.saved_model.loader.load(sess, ["serve"], export_path)
    sig_def = meta_graph.signature_def[tf.saved_model.DEFAULT_SERVING_SIGNATURE_DEF_KEY]
    input_name = sig_def.inputs['input'].name
    output_name = sig_def.outputs['output'].name
    img = Image.new('RGB', (299, 299))                                                                                                                          
    x = tf.keras.preprocessing.image.img_to_array(img)                                                                                                          
    x = np.expand_dims(x, axis=0)                                                                                                                               
    x = x[..., :3]                                                                                                                                              
    x /= 255.0                                                                                                                                                  
    x = (x - 0.5) * 2.0   
    y_pred = sess.run(output_name, feed_dict={input_name: x})        
    print(y_pred)   
INFO:tensorflow:Restoring parameters from ./save_test/variables/variables
[[0.21079363 0.10773096 0.07287834 0.06983061 0.10538215 0.09172108
  0.34166315]]
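(If you want to control both the signature keys and the placeholder names explicitly, another option, not part of the original answer and so offered only as a sketch, is to export a dedicated serving function through the signatures argument of tf.saved_model.save, reusing model and export_path from above. The input key should follow the TensorSpec name and the output key the dictionary key returned by the function; it is worth confirming the resulting names in meta_graph.signature_def after loading.)

# Explicit serving signature: the TensorSpec name determines the input key and
# placeholder, the returned dict key determines the output key.
@tf.function(input_signature=[tf.TensorSpec([None, 299, 299, 3], tf.float32, name='input')])
def serving_fn(images):
    return {'output': model(images, training=False)}

tf.saved_model.save(model, export_path, signatures={'serving_default': serving_fn})

With this in place, sig_def.inputs['input'].name and sig_def.outputs['output'].name in the loading code above should resolve regardless of how many times the Keras model has been rebuilt in the process.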
