Use tf.saved_model to predict multiple input vectors (tensorflow 2.0)
I trained an estimator object for prediction. But as you may know, estimator.predict restores the model parameters every time it runs, which is really slow. So I followed this guide to speed it up. Since I'm using tensorflow 2.0, the tf.contrib.predictor API recommended in that guide is no longer available, so I resorted to the saved_model API, which is the official way of loading models.
Here's the code for saving the estimator to a saved_model. (I only have 5 features for now.)
serving_input_fn = tf.estimator.export.build_parsing_serving_input_receiver_fn(
    tf.feature_column.make_parse_example_spec(
        [tf.feature_column.numeric_column(str(x)) for x in range(1, 6)]))
my_estimator.export_saved_model('saved_model', serving_input_fn)
Output:
INFO:tensorflow:Done calling model_fn.
INFO:tensorflow:Signatures INCLUDED in export for Classify: None
INFO:tensorflow:Signatures INCLUDED in export for Regress: ['serving_default', 'regression']
INFO:tensorflow:Signatures INCLUDED in export for Predict: ['predict']
INFO:tensorflow:Signatures INCLUDED in export for Train: None
INFO:tensorflow:Signatures INCLUDED in export for Eval: None
INFO:tensorflow:Restoring parameters from ./output\model.ckpt-100000
INFO:tensorflow:Assets added to graph.
INFO:tensorflow:No assets to write.
INFO:tensorflow:SavedModel written to: saved_model\temp-b'1579582279'\saved_model.pb
Following the official guide for predicting, I called the predict signature on a tf.Example built with my input data:
example = tf.train.Example()
example.features.feature["1"].float_list.value.append(1) #note here the float_list.value can take multiple values
example.features.feature["2"].float_list.value.append(1)
example.features.feature["3"].float_list.value.append(1)
example.features.feature["4"].float_list.value.append(1)
example.features.feature["5"].float_list.value.append(1)
and made the prediction with
my_model=tf.saved_model.load('saved_model/1579582279')
my_prediction=my_model.signatures["predict"](examples=tf.constant([example.SerializeToString()]))
This works fine. But when I construct the tf.Example with a list of values for each feature and try to predict with the same code:
example = tf.train.Example()
example.features.feature["1"].float_list.value.extend([1,2])
example.features.feature["2"].float_list.value.extend([1,2])
example.features.feature["3"].float_list.value.extend([1,2])
example.features.feature["4"].float_list.value.extend([1,2])
example.features.feature["5"].float_list.value.extend([1,2])
my_prediction=my_model.signatures["predict"](examples=tf.constant([example.SerializeToString()]))
It gives me this error:
InvalidArgumentError: Name: <unknown>, Key: 2, Index: 0. Number of float values != expected. Values size: 2 but output shape: [1]
[[node ParseExample/ParseExample (defined at c:\users\i354164\appdata\local\programs\python\python36\lib\site-packages\tensorflow_core\python\framework\ops.py:1751) ]] [Op:__inference_pruned_2040]
Function call stack:
pruned
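The failure can be reproduced without the model at all: make_parse_example_spec turns each numeric_column into a FixedLenFeature with shape [1], so a feature carrying two float values cannot be parsed. A minimal sketch of just the parsing step (no estimator involved):

```python
import tensorflow as tf

# The parse spec built from numeric_column declares every feature as a
# FixedLenFeature with shape [1] -- one float per feature per Example.
spec = tf.feature_column.make_parse_example_spec(
    [tf.feature_column.numeric_column(str(x)) for x in range(1, 6)])

# Pack two values into each feature, as in the failing case above.
example = tf.train.Example()
for i in range(1, 6):
    example.features.feature[str(i)].float_list.value.extend([1.0, 2.0])

try:
    tf.io.parse_example(tf.constant([example.SerializeToString()]), spec)
except tf.errors.InvalidArgumentError as err:
    print(err.message)  # e.g. "... Values size: 2 but output shape: [1]"
```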
My question is: how do I export/load the saved_model so that it can take a tf.Example with more than one input?
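One workaround that keeps the existing export unchanged: the examples argument of the predict signature takes a batch of serialized tf.train.Example protos, so instead of packing two values per feature into one Example, build one Example per input row and batch the serialized strings. A sketch (the make_examples helper is mine, not part of any API):

```python
import tensorflow as tf

def make_examples(rows):
    """Serialize one tf.train.Example per input row.

    rows: list of dicts mapping feature name ("1".."5") to a float value.
    """
    serialized = []
    for row in rows:
        example = tf.train.Example()
        for name, value in row.items():
            example.features.feature[name].float_list.value.append(value)
        serialized.append(example.SerializeToString())
    return tf.constant(serialized)

# Two input vectors -> a batch of two serialized Examples.
batch = make_examples([
    {str(i): 1.0 for i in range(1, 6)},
    {str(i): 2.0 for i in range(1, 6)},
])
# my_prediction = my_model.signatures["predict"](examples=batch)
```

Each Example then carries exactly one value per feature, which matches the shape-[1] parse spec, and the batch dimension comes from the vector of serialized strings.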
It seems that you need to adjust the number of positional arguments after loading the model. This applies when the model is exported with a raw serving input receiver, so that the signature takes plain tensors rather than serialized Examples: how to input multi features for tensorflow model inference

infer._num_positional_args = 2  # private attribute; may change between TF versions
infer(tf.constant(x1), tf.constant(x2))