
Cloud ML model prediction

I have deployed a TensorFlow SavedModel in Cloud ML for text classification with the following:

    input_x = graph.get_tensor_by_name('input_x:0')
    keep_prob = graph.get_tensor_by_name('keep_prob:0')
    batch_size = graph.get_tensor_by_name('batch_size:0')
    sequence_length = graph.get_tensor_by_name('sequence_length:0')
    predictions = graph.get_tensor_by_name('softmax/predictions:0')

    feed_dict = {input_x: x_test, batch_size: 8, sequence_length: x_lengths, keep_prob: 1.0}

It deployed with no errors. I have a CSV file to predict. --csv file--

"the test is completed"
"the test2 is done"

I'm getting only errors. How do I convert this CSV to JSON for the model I trained, so that I can batch predict in Cloud ML?

saved_model_cli info:

    MetaGraphDef with tag-set: 'serve' contains the following SignatureDefs:

    signature_def['serving_default']:
      The given SavedModel SignatureDef contains the following input(s):
        inputs['batch_size'] tensor_info:
            dtype: DT_INT32
            shape: ()
            name: batch_size:0
        inputs['input_x'] tensor_info:
            dtype: DT_INT32
            shape: (-1, 25)
            name: input_x:0
        inputs['keep_prob'] tensor_info:
            dtype: DT_FLOAT
            shape: ()
            name: keep_prob:0
        inputs['sequence_length'] tensor_info:
            dtype: DT_INT32
            shape: (-1)
            name: sequence_length:0
      The given SavedModel SignatureDef contains the following output(s):
        outputs['predictions'] tensor_info:
            dtype: DT_INT64
            shape: (-1)
            name: softmax/predictions:0
      Method name is: tensorflow/serving/predict

Currently I converted the CSV to JSON and used it to predict:

{"sequence_length": 25, "batch_size": 1, "keep_prob": 1.0, "input_x": [1, 2, 3, 4, 5, 6, 7, 8, 9, 1, 10, 11, 12, 13, 14, 15, 1, 16, 12, 13, 14, 17, 18, 19, 20]}

Exception:

    Exception during running the graph: Cannot feed value of shape (1,) for Tensor u'keep_prob:0', which has shape '()' (Error code: 2)

This model appears to need several changes to be directly servable. One requirement of the service is that each of the inputs has an unspecified outer dimension, which is interpreted as the "batch" dimension. The inputs input_x and sequence_length meet this requirement, but batch_size and keep_prob do not.
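
For reference, once batch_size and keep_prob are removed from the signature (as described below), each line of the batch-prediction input file would be a single JSON instance containing only the batched inputs, along the lines of the following (reusing the encoded example above; with shape (-1), sequence_length is a per-instance scalar):

    {"input_x": [1, 2, 3, 4, 5, 6, 7, 8, 9, 1, 10, 11, 12, 13, 14, 15, 1, 16, 12, 13, 14, 17, 18, 19, 20], "sequence_length": 25}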

The service dynamically builds batches, which is why the batch dimension needs to be variable length. Thus, having an input called batch_size is going to be problematic, since the service doesn't know that it should be setting that input. Instead, it builds up a batch and sends it to TensorFlow. TensorFlow already knows the batch size because it's the value of the outer dimension of inputs such as input_x.

Instead of using batch_size as an input, it's preferable to do something like:

    batch_size = tf.shape(input_x)[0]

Although I will note that even the need for that is usually fairly rare in practice. Things usually "just work", because input_x is used in some sort of operation like a matrix multiply or convolution, which handles things fine without explicitly knowing the batch size.
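
As a minimal illustration of that point (a sketch only; the float dtype and output width here are placeholders, not the asker's model):

    import tensorflow as tf

    # The batch dimension is left unspecified (None).
    x = tf.placeholder(tf.float32, shape=[None, 25])
    w = tf.Variable(tf.zeros([25, 4]))

    # The matmul carries the unknown batch dimension through: logits has
    # shape (?, 4), and the actual batch size is resolved at run time from
    # whatever is fed.
    logits = tf.matmul(x, w)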

Finally, there is keep_prob, which usually indicates there is a dropout layer in the model. Even though you can hard-code this to 1.0, it's usually recommended that you just remove dropout layers altogether for serving. Basically, when you export the model, you actually build a different graph than for training. This is exemplified in this sample.
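
A minimal sketch of what such an export might look like in TF 1.x, assuming the model-building code can be reused; build_inference_graph and the checkpoint path are hypothetical stand-ins for your own code:

    import tensorflow as tf

    graph = tf.Graph()
    with graph.as_default():
        # Only the batched inputs; no batch_size or keep_prob placeholders.
        input_x = tf.placeholder(tf.int32, shape=[None, 25], name='input_x')
        sequence_length = tf.placeholder(tf.int32, shape=[None], name='sequence_length')

        # Batch size derived from the input itself, as shown above.
        batch_size = tf.shape(input_x)[0]

        # Rebuild the model for inference only, with dropout omitted entirely.
        # build_inference_graph is a hypothetical stand-in for your model code.
        predictions = build_inference_graph(input_x, sequence_length, batch_size)

        with tf.Session(graph=graph) as sess:
            # Restore the trained weights from a training checkpoint (path is hypothetical).
            tf.train.Saver().restore(sess, 'checkpoints/model.ckpt')
            tf.saved_model.simple_save(
                sess, 'export/1',
                inputs={'input_x': input_x, 'sequence_length': sequence_length},
                outputs={'predictions': predictions})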
