
Tensorflow Serving: Rest API returns "Malformed request" error

Tensorflow Serving server (run with docker) responds to my GET (and POST) requests with this:

{ "error": "Malformed request: POST /v1/models/saved_model/" }

Precisely the same problem was already reported but never solved (supposedly, this is a StackOverflow kind of question, not a GitHub issue):

https://github.com/tensorflow/serving/issues/1085

https://github.com/tensorflow/serving/issues/1095

Any ideas? Thank you very much.

I verified that this does not work pre-1.12 and does indeed work with 1.12 and later.

> docker run -it -p 127.0.0.1:9000:8500 -p 127.0.0.1:9009:8501 -v /models/55:/models/55 -e MODEL_NAME=55 --rm tensorflow/serving
> curl http://localhost:9009/v1/models/55
   { "error": "Malformed request: GET /v1/models/55" }

Now try with 1.12:

> docker run -it -p 127.0.0.1:9000:8500 -p 127.0.0.1:9009:8501 -v /models/55:/models/55 -e MODEL_NAME=55 --rm tensorflow/serving:1.12.0
> curl http://localhost:9009/v1/models/55
{
 "model_version_status": [
  {
   "version": "1541703514",
   "state": "AVAILABLE",
   "status": {
    "error_code": "OK",
    "error_message": ""
   }
  }
 ]
}
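If you only care about one version, the same status endpoint should also accept a version suffix; this is a sketch reusing the version number from the response above:

> curl http://localhost:9009/v1/models/55/versions/1541703514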

Depends on your model, but this is what my request body looks like:

{"inputs": {"text": ["Hello"]}} {“inputs”:{“text”:[“Hello”]}}

I used Postman to help me out, so that it knew the payload was JSON.

This is for the predict API, so the URL ends in ":predict". Again, that depends on which API you're trying to use.
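Putting that together, a minimal predict call for this thread's setup might look like the line below (a sketch assuming the model name saved_model and the default REST port 8501 from the question; the "text" input key is specific to my model's signature):

curl -X POST http://localhost:8501/v1/models/saved_model:predict -d '{"inputs": {"text": ["Hello"]}}'

Single quotes around the body work in Unix-like shells; Windows cmd needs escaped double quotes instead, as described in another answer here.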

There were two issues with my approach:

1) The status check request wasn't supported in my Tensorflow_model_server (see https://github.com/tensorflow/serving/issues/1085 for details).

2) More importantly, when using Windows you must escape quotation marks in JSON. So instead of:

curl -XPOST http://localhost:8501/v1/models/saved_model:predict -d "{"instances":[{"features":[1,1,1,1,1,1,1,1,1,1]}]}"

I should have used this:

curl -XPOST http://localhost:8501/v1/models/saved_model:predict -d "{\"instances\":[{\"features\":[1,1,1,1,1,1,1,1,1,1]}]}"
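Another way to sidestep the Windows quoting problem is to keep the body in a file and let curl read it with -d @. Here request.json is a hypothetical file containing the JSON payload shown above:

curl -XPOST http://localhost:8501/v1/models/saved_model:predict -d @request.json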

Model status API is only supported in the master branch. There is no TF Serving release that supports it yet (the API is slated for the upcoming 1.12 release). You can use the nightly docker image (tensorflow/serving:nightly) to test master branch builds.
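For example, a sketch of testing the status API against the nightly image, reusing the model path and ports from the earlier commands in this thread:

> docker pull tensorflow/serving:nightly
> docker run -it -p 127.0.0.1:9000:8500 -p 127.0.0.1:9009:8501 -v /models/55:/models/55 -e MODEL_NAME=55 --rm tensorflow/serving:nightly
> curl http://localhost:9009/v1/models/55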

This solution was given by netf in issue #1128 of tensorflow/serving. I already tried this solution; it works and I can get the model status. (The original answer included a screenshot of the model status demo.)

Hope I can help you.

If you are not clear about the master branch builds, you can contact me.

I can give you instructions.

Email: mizeshuang@gmail.com
