
Testing TF serving model fails with bytes as strings and strings as bytes confusion

I'm having a problem serving my text classification model on TensorFlow 1.12. I'm using tf.estimator.inputs.pandas_input_fn to read in my data, and tf.estimator.DNNClassifier to train/evaluate. I'd then like to serve my model. (Apologies in advance, it's tough to provide a full working example here, but it's very much like the example TF provides at https://www.tensorflow.org/api_docs/python/tf/estimator/DNNClassifier )

I'm currently saving my model with ...

...
estimator.export_savedmodel("./TEST_SERVING/", self.serving_input_receiver_fn, strip_default_attrs=True)
...
def serving_input_receiver_fn(self):
      """An input receiver that expects a serialized tf.Example."""

      # feature spec dictionary determines our input parameters for the model
      feature_spec = {
          'Headline': tf.VarLenFeature(dtype=tf.string),
          'Description': tf.VarLenFeature(dtype=tf.string)
      }

      # the inputs will be initially fed as strings with data serialized by
      # Google ProtoBuffers
      serialized_tf_example = tf.placeholder(
          dtype=tf.string, shape=None, name='input_example_tensor')
      receiver_tensors = {'examples': serialized_tf_example}

      # deserialize input
      features = tf.parse_example(serialized_tf_example, feature_spec)
      return tf.estimator.export.ServingInputReceiver(features, receiver_tensors)


This actually fails to run with the error:

TypeError: Failed to convert object of type <class 'tensorflow.python.framework.sparse_tensor.SparseTensor'> to Tensor. Contents: SparseTensor(indices=Tensor("ParseExample/ParseExample:0", shape=(?, 2), 
dtype=int64), values=Tensor("ParseExample/ParseExample:2", shape=(?,), dtype=string), dense_shape=Tensor("ParseExample/ParseExample:4", shape=(2,), dtype=int64)). Consider casting elements to a supported type.
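For context, this error arises because VarLenFeature makes tf.parse_example return SparseTensor values, which the estimator's dense input path rejects. One possible workaround (a sketch, assuming each example carries exactly one Headline and one Description string, which may not hold for every dataset) is to declare the features as FixedLenFeature so parsing yields dense tensors. Written with tf.compat.v1 so the 1.x-style graph code also runs under TF 2.x; on TF 1.12 itself, `import tensorflow as tf` directly:

```python
import tensorflow.compat.v1 as tf  # on TF 1.12: `import tensorflow as tf`

tf.disable_eager_execution()  # graph mode, as in TF 1.x

def serving_input_receiver_fn():
    """An input receiver that expects a serialized tf.Example."""
    # FixedLenFeature([]) yields a dense scalar string per example,
    # where VarLenFeature would yield a SparseTensor.
    feature_spec = {
        'Headline': tf.FixedLenFeature([], dtype=tf.string),
        'Description': tf.FixedLenFeature([], dtype=tf.string),
    }

    # the inputs arrive as serialized tf.Example protos
    serialized_tf_example = tf.placeholder(
        dtype=tf.string, shape=[None], name='input_example_tensor')
    receiver_tensors = {'examples': serialized_tf_example}

    # deserialize input into dense string tensors
    features = tf.parse_example(serialized_tf_example, feature_spec)
    return tf.estimator.export.ServingInputReceiver(features, receiver_tensors)
```

Whether this fits depends on the feature columns feeding the DNNClassifier; it only removes the SparseTensor from the parsed features.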

I tried saving a second way:

def serving_input_receiver_fn(self):
  """Build the serving inputs."""
  INPUT_COLUMNS = ["Headline","Description"]
  inputs = {}
  for feat in INPUT_COLUMNS:
    inputs[feat] = tf.placeholder(shape=[None], dtype=tf.string, name=feat)
  return tf.estimator.export.ServingInputReceiver(inputs, inputs)

This actually works, until I try testing it with saved_model_cli. Some output from saved_model_cli show --all --dir TEST_SERVING/1553879255/ :

MetaGraphDef with tag-set: 'serve' contains the following SignatureDefs:

signature_def['predict']:
  The given SavedModel SignatureDef contains the following input(s):
    inputs['Description'] tensor_info:
        dtype: DT_STRING
        shape: (-1)
        name: Description:0
    inputs['Headline'] tensor_info:
        dtype: DT_STRING
        shape: (-1)
        name: Headline:0
  The given SavedModel SignatureDef contains the following output(s):
    outputs['class_ids'] tensor_info:
        dtype: DT_INT64
        shape: (-1, 1)
        name: dnn/head/predictions/ExpandDims:0
    outputs['classes'] tensor_info:
        dtype: DT_STRING
        shape: (-1, 1)
        name: dnn/head/predictions/str_classes:0
    outputs['logits'] tensor_info:
        dtype: DT_FLOAT
        shape: (-1, 3)
        name: dnn/logits/BiasAdd:0
    outputs['probabilities'] tensor_info:
        dtype: DT_FLOAT
        shape: (-1, 3)
        name: dnn/head/predictions/probabilities:0
  Method name is: tensorflow/serving/predict

But now I can't seem to test it.

$ saved_model_cli run --dir TEST_SERVING/1553879255/ --tag_set serve --signature_def predict --input_examples 'inputs=[{"Description":["What is going on"],"Headline":["Help me"]}]'
Traceback (most recent call last):
 ...
  File "/Users/Josh/miniconda3/envs/python36/lib/python3.6/site-packages/tensorflow/python/tools/saved_model_cli.py", line 489, in _create_example_string
    feature_list)
TypeError: 'What is going on' has type str, but expected one of: bytes

Ok, let's turn it into a bytes object by changing to b["What is going on"] and b["Help me"] ...

ValueError: Type <class 'bytes'> for value b'What is going on' is not supported for tf.train.Feature.
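The two error messages pull in opposite directions because of Python 3's strict separation of text and raw bytes: protobuf `bytes` fields (like the BytesList inside tf.train.Feature, which tf.Example uses for strings) only accept bytes, while the CLI's own wrapping code can still hand them str. A minimal illustration of the distinction itself:

```python
# Python 3 separates text (str) from raw bytes (bytes); they never compare
# equal and must be converted explicitly with encode()/decode().
text = "What is going on"
encoded = text.encode("utf-8")           # str -> bytes
assert isinstance(encoded, bytes)
assert encoded.decode("utf-8") == text   # bytes -> str round-trips losslessly
```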

Any ideas/thoughts?? Thanks!

Ok, so eventually I found the answer, quoted in TensorFlow: how to export estimator using TensorHub module?

The problem was with the serialization, which I don't fully understand. The solution is to pass raw strings to tf.estimator.export.build_raw_serving_input_receiver_fn instead.

My saving function now looks like this:

  def save_serving_model(self, estimator):
      feature_placeholder = {
          'Headline': tf.placeholder('string', [1], name='headline_placeholder'),
          'Description': tf.placeholder('string', [1], name='description_placeholder')
      }
      serving_input_fn = tf.estimator.export.build_raw_serving_input_receiver_fn(feature_placeholder)

      estimator.export_savedmodel("TEST_SERVING/", serving_input_fn)

where using saved_model_cli works, i.e.:

saved_model_cli run --dir /path/to/model/ --tag_set serve --signature_def predict --input_exprs="Headline=['Finally, it works'];Description=['Yay, it works']" 

Result for output key class_ids:
[[2]]
Result for output key classes:
[[b'2']]
Result for output key logits:
[[-0.56755465  0.31625098  0.39260274]]
Result for output key probabilities:
[[0.16577701 0.40119565 0.4330274 ]]
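Beyond the CLI, the same exported signature can be queried over TensorFlow Serving's REST API. A sketch of building the request body only — the model name, host, port, and the exact per-instance layout here are assumptions, not taken from the setup above:

```python
import json

# Hypothetical request body for TF Serving's REST predict endpoint,
# e.g. POST http://localhost:8501/v1/models/my_model:predict
# (host, port, and model name are placeholders).
payload = {
    "signature_name": "predict",
    "instances": [
        {"Headline": ["Finally, it works"], "Description": ["Yay, it works"]}
    ],
}
body = json.dumps(payload)

# Round-trip check: the serialized body parses back to the same structure.
assert json.loads(body)["instances"][0]["Headline"] == ["Finally, it works"]
```

Each instance carries a length-1 list per feature to mirror the `[1]`-shaped placeholders used at export time.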
