
ai-platform cloud predict not working but local predict works

I have successfully trained, and predicted locally with, a DNNLinearCombinedClassifier built from the ai-platform-samples templates.

When I run pip freeze | grep tensorflow on my local PC, I get:

tensorflow==1.15.0
tensorflow-datasets==1.2.0
tensorflow-estimator==1.15.1
tensorflow-hub==0.6.0
tensorflow-io==0.8.0
tensorflow-metadata==0.15.1
tensorflow-model-analysis==0.15.4
tensorflow-probability==0.8.0
tensorflow-serving-api==1.15.0

When I run saved_model_cli show on the saved model, I get the following output:

The given SavedModel SignatureDef contains the following input(s):
  inputs['Sector'] tensor_info:
      dtype: DT_STRING
      shape: (-1)
      name: Placeholder_2:0
  inputs['announcement_type_simple'] tensor_info:
      dtype: DT_STRING
      shape: (-1)
      name: Placeholder_1:0
  inputs['market_cap'] tensor_info:
      dtype: DT_FLOAT
      shape: (-1)
      name: Placeholder_3:0
  inputs['sens_content'] tensor_info:
      dtype: DT_STRING
      shape: (-1)
      name: Placeholder:0
The given SavedModel SignatureDef contains the following output(s):
  outputs['all_class_ids'] tensor_info:
      dtype: DT_INT32
      shape: (-1, 3)
      name: head/predictions/Tile:0
  outputs['all_classes'] tensor_info:
      dtype: DT_STRING
      shape: (-1, 3)
      name: head/predictions/Tile_1:0
  outputs['class_ids'] tensor_info:
      dtype: DT_INT64
      shape: (-1, 1)
      name: head/predictions/ExpandDims_2:0
  outputs['classes'] tensor_info:
      dtype: DT_STRING
      shape: (-1, 1)
      name: head/predictions/str_classes:0
  outputs['logits'] tensor_info:
      dtype: DT_FLOAT
      shape: (-1, 3)
      name: dnn/logits/BiasAdd:0
  outputs['probabilities'] tensor_info:
      dtype: DT_FLOAT
      shape: (-1, 3)
      name: head/predictions/probabilities:0
Method name is: tensorflow/serving/predict

The inputs are consistent with what I put in my json file, shown below:

{"sens_content": "RFG 201411130005A Trading Statement Rhodes Food Group", "announcement_type_simple": "trade statement", "Sector": "Consumer, Non-cyclical","market_cap": 4377615219.88}

The model predicts successfully with gcloud ai-platform local predict.

When I run gcloud ai-platform predict --model=${MODEL_NAME} --version=${MODEL_VERSION} --json-instances=data/new-data.json --verbosity debug --log-http, it creates the following POST:

==== request start ====
uri: https://ml.googleapis.com/v1/projects/simon-teraflow-project/models/tensorflow_sens1/versions/v3:predict
method: POST
== headers start ==
Authorization: --- Token Redacted ---
Content-Type: application/json
user-agent: gcloud/270.0.0 command/gcloud.ai-platform.predict invocation-id/f01f2f4b8c494082abfc38e19499019b environment/GCE environment-version/None interactive/True from-script/False python/2.7.13 term/xterm (Linux 4.9.0-11-amd64)
== headers end ==
== body start ==
{"instances": [{"Sector": "Consumer, Non-cyclical", "announcement_type_simple": "trade statement", "market_cap": 4377615219.88, "sens_content": "RFG 201411130005A Trading Statement Rhodes Food Group"}]}
== body end ==
==== request end ====

You can see that the input matches what is required. Here is the response:

Traceback (most recent call last):
  File "/usr/lib/google-cloud-sdk/lib/googlecloudsdk/calliope/cli.py", line 984, in Execute
    resources = calliope_command.Run(cli=self, args=args)
  File "/usr/lib/google-cloud-sdk/lib/googlecloudsdk/calliope/backend.py", line 798, in Run
    resources = command_instance.Run(args)
  File "/usr/lib/google-cloud-sdk/lib/surface/ai_platform/predict.py", line 110, in Run
    signature_name=args.signature_name)
  File "/usr/lib/google-cloud-sdk/lib/googlecloudsdk/api_lib/ml_engine/predict.py", line 77, in Predict
    response_body)
HttpRequestFailError: HTTP request failed. Response: {
  "error": {
    "code": 400,
    "message": "Bad Request",
    "status": "INVALID_ARGUMENT"
  }
}

ERROR: (gcloud.ai-platform.predict) HTTP request failed. Response: {
  "error": {
    "code": 400,
    "message": "Bad Request",
    "status": "INVALID_ARGUMENT"
  }
} 

I tried the same thing with "Test your model" in the AI Platform console. Same result.

I have checked that the runtime version is 1.15, which matches local prediction, as does the Python version.

I have searched for similar cases but found nothing. Any suggestions would be greatly appreciated.

You can try the following:

1) Save your model locally. You can use the code snippet below [1] as an example, adapting it to your model.

2) Test using Docker.

3) Deploy the model to GCP and make requests to it [2] (adapted to your model) using gcloud commands instead of the GCP UI.
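For suggestion 2), one way to test with Docker is to serve the exported SavedModel with the tensorflow/serving image (for example: docker run -p 8501:8501 --mount type=bind,source=$(pwd)/models/<MODEL NAME>,target=/models/<MODEL NAME> -e MODEL_NAME=<MODEL NAME> -t tensorflow/serving:1.15.0) and then send the same instances to its REST endpoint. A minimal client sketch; the host, port, and model name are assumptions to adapt:

```python
import json
import urllib.request

def serving_url(model_name, host="localhost", port=8501):
    """REST :predict endpoint of a local TensorFlow Serving container."""
    return "http://{}:{}/v1/models/{}:predict".format(host, port, model_name)

def local_serving_predict(instances, model_name):
    """POST instances to the local container and decode the JSON reply."""
    request = urllib.request.Request(
        serving_url(model_name),
        data=json.dumps({"instances": instances}).encode("utf-8"),
        headers={"Content-Type": "application/json"})
    with urllib.request.urlopen(request) as resp:
        return json.loads(resp.read().decode())

# Usage, once the container above is running:
# print(local_serving_predict([{"text": "cat"}], "<MODEL NAME>"))
```

If the container rejects the same instances, the problem is in the exported model or the payload rather than in AI Platform itself.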

[1]

========Code snippet===============
import tensorflow as tf
import tensorflow_hub as hub

MODEL_NAME = <MODEL NAME>
VERSION = <MODEL VERSION>
SERVE_PATH = './models/{}/{}'.format(MODEL_NAME, VERSION)

use_model = "https://tfhub.dev/google/<MODEL NAME>/<MODEL VERSION>"

with tf.Graph().as_default():
  module = hub.Module(use_model, name=MODEL_NAME)
  text = tf.placeholder(tf.string, [None])
  embedding = module(text)

  init_op = tf.group([tf.global_variables_initializer(), tf.tables_initializer()])

  with tf.Session() as session:
    session.run(init_op)

    # Export in SavedModel format; legacy_init_op makes sure lookup
    # tables are initialized when the model is loaded for serving.
    tf.saved_model.simple_save(
        session,
        SERVE_PATH,
        inputs = {"text": text},
        outputs = {"embedding": embedding},
        legacy_init_op = tf.tables_initializer()
    )
========/ Code snippet===============

[2]

Replace <Project_name>, <model_name>, <bucket_name> and <model_version>:

    $ gcloud ai-platform models create <model_name> --project <Project_name>
    $ gcloud beta ai-platform versions create v1 --project <Project_name> --model <model_name> --origin=/location/of/model/dir/<model_name>/<model_version> --staging-bucket gs://<bucket_name> --runtime-version=1.15 --machine-type=n1-standard-8
    $ echo '{"text": "cat"}' > instances.json
    $ gcloud ai-platform predict --project <Project_name> --model <model_name> --version v1 --json-instances=instances.json
    $ curl -X POST -v -k -H "Content-Type: application/json" -d '{"instances": [{"text": "cat"}]}'  -H "Authorization: Bearer `gcloud auth print-access-token`" "https://ml.googleapis.com/v1/projects/<Project_name>/models/<model_name>/versions/v1:predict"
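If the gcloud error stays opaque, the curl request in [2] can also be issued from Python, which makes it easy to capture the full HTTP error body; it sometimes carries more detail than the summarized "Bad Request". A minimal sketch using only the standard library (the token would come from gcloud auth print-access-token, as in the curl example):

```python
import json
import urllib.error
import urllib.request

PREDICT_URL = ("https://ml.googleapis.com/v1/projects/{project}"
               "/models/{model}/versions/{version}:predict")

def build_predict_request(instances, project, model, version, token):
    """Wrap instances in the {"instances": [...]} envelope and prepare the POST."""
    return urllib.request.Request(
        PREDICT_URL.format(project=project, model=model, version=version),
        data=json.dumps({"instances": instances}).encode("utf-8"),
        headers={"Content-Type": "application/json",
                 "Authorization": "Bearer " + token},
        method="POST")

def send(request):
    """Return the raw response body, including the full JSON error on failure."""
    try:
        with urllib.request.urlopen(request) as resp:
            return resp.read().decode()
    except urllib.error.HTTPError as err:
        return err.read().decode()

# Usage (needs credentials; token from `gcloud auth print-access-token`):
# req = build_predict_request([{"text": "cat"}], "<Project_name>",
#                             "<model_name>", "v1", token)
# print(send(req))
```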

