
AWS Sagemaker Model Error when making Predictor Call

I am trying to make a TensorFlow predictor call, but I am receiving a ModelError - 502 Bad Gateway. I can't seem to trace what is happening on the server that causes this error. The model is deployed on an ml.c4.xlarge instance.

As requested in the comments, here is the code used to deploy the model:

#Import Tensorflow Model
from sagemaker.tensorflow.serving import Model
sagemaker_model = Model(model_data='s3://' + sagemaker_session.default_bucket() + '/Scikit-keras-NLP-pipeline-examplet/train/example.tar.gz',
                        role=role,
                        sagemaker_session=sagemaker_session)

scikit_learn_inference_model = sklearn_preprocessor.create_model() 

#Build Inference Pipeline
from sagemaker.pipeline import PipelineModel

sm_model = PipelineModel(
    name=model_name, 
    role=role, 
    models=[
        scikit_learn_inference_model, 
        sagemaker_model],
    sagemaker_session=sagemaker_session)

sm_model.deploy(initial_instance_count=1, instance_type='ml.c4.xlarge', endpoint_name=endpoint_name)
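For context (this note is mine, not from the original post): in a PipelineModel the inference containers run in sequence, and each container's response body becomes the next container's request body. A 502 from container-2 therefore usually means the TensorFlow container choked on whatever the scikit-learn container handed it. A minimal offline sketch of that chaining, using hypothetical stand-in functions rather than real SageMaker code:

```python
# Offline sketch of PipelineModel request routing (hypothetical stand-ins,
# not SageMaker code): container-1 transforms the payload, container-2
# receives container-1's output verbatim as its own input.

def sklearn_container(csv_row: str) -> str:
    # Stand-in for the preprocessing container: scale each feature.
    values = [float(v) for v in csv_row.split(",")]
    scaled = [v / 100.0 for v in values]
    return ",".join(str(v) for v in scaled)

def tensorflow_container(body: str) -> list:
    # Stand-in for the TF Serving container: it must be able to parse
    # exactly what the previous container emitted.
    features = [float(v) for v in body.split(",")]
    return [sum(features)]  # dummy "prediction"

def pipeline_invoke(payload: str) -> list:
    # The endpoint feeds containers in order; a failure here surfaces
    # to the client as a ModelError from the failing container.
    return tensorflow_container(sklearn_container(payload))
```

The point of the sketch: if container-1's output format changes (e.g. JSON instead of CSV), container-2 fails even though each container works in isolation.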

For reference: here is the code in the SageMaker notebook that attempts a prediction using a .csv file saved in S3:

from sagemaker.predictor import json_serializer, csv_serializer, json_deserializer, RealTimePredictor
from sagemaker.content_types import CONTENT_TYPE_CSV, CONTENT_TYPE_JSON

predictor = RealTimePredictor(
    endpoint='example',
    sagemaker_session=sagemaker_session,
    serializer=csv_serializer,
    content_type=CONTENT_TYPE_CSV,
    accept=CONTENT_TYPE_JSON)

import numpy as np
import pandas as pd
from sklearn.pipeline import Pipeline
from sklearn.preprocessing import Binarizer, StandardScaler, OneHotEncoder
from sklearn.impute import SimpleImputer

column_names = ['CRIM', 'ZN', 'INDUS', 'CHAS', 'NOX', 'RM', 'AGE', 'DIS', 'RAD', 'TAX', 'PTRATIO', 'B', 'LSTAT', 'MEDV']
dff = pd.read_csv('housing.csv',delimiter=r"\s+", names=column_names)
dff.drop('MEDV',axis=1,inplace=True)

#String
x = '0.00632,18.0,2.31,0,0.538,6.575,65.2,4.09,1,296.0,15.3,396.9,4.98'

#DataFrame
y = dff.head(1)

#Array
z = np.array([[0.00632,18.0,2.31,0,0.538,6.575,65.2,4.09,1,296.0,15.3,396.9,4.98]])

print(predictor.predict(z))
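As a debugging aid (my suggestion, not part of the original question): it can help to inspect the exact CSV payload that will be sent before invoking the endpoint. The sketch below is a rough approximation of what `csv_serializer` produces for a 2-D array; the SDK's own serializer may format floats slightly differently:

```python
import io
import numpy as np

def to_csv_payload(arr: np.ndarray) -> str:
    # Approximate the CSV body csv_serializer would send: one row per
    # sample, comma-delimited, no header, no trailing newline.
    buf = io.StringIO()
    np.savetxt(buf, arr, delimiter=",", fmt="%g")
    return buf.getvalue().rstrip("\n")

z = np.array([[0.00632, 18.0, 2.31, 0, 0.538, 6.575, 65.2,
               4.09, 1, 296.0, 15.3, 396.9, 4.98]])
print(to_csv_payload(z))
```

If this string differs from what the first container expects (column count, delimiter, header row), the request can fail inside the pipeline even though the client call itself is well-formed.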

Here is the full error, for reference:

---------------------------------------------------------------------------
ModelError                                Traceback (most recent call last)
<ipython-input-33-1a95d0a95a02> in <module>()
----> 1 print(predictor.predict(z))

~/anaconda3/envs/tensorflow_p36/lib/python3.6/site-packages/sagemaker/predictor.py in predict(self, data, initial_args, target_model)
    108 
    109         request_args = self._create_request_args(data, initial_args, target_model)
--> 110         response = self.sagemaker_session.sagemaker_runtime_client.invoke_endpoint(**request_args)
    111         return self._handle_response(response)
    112 

~/anaconda3/envs/tensorflow_p36/lib/python3.6/site-packages/botocore/client.py in _api_call(self, *args, **kwargs)
    314                     "%s() only accepts keyword arguments." % py_operation_name)
    315             # The "self" in this scope is referring to the BaseClient.
--> 316             return self._make_api_call(operation_name, kwargs)
    317 
    318         _api_call.__name__ = str(py_operation_name)

~/anaconda3/envs/tensorflow_p36/lib/python3.6/site-packages/botocore/client.py in _make_api_call(self, operation_name, api_params)
    624             error_code = parsed_response.get("Error", {}).get("Code")
    625             error_class = self.exceptions.from_code(error_code)
--> 626             raise error_class(parsed_response, operation_name)
    627         else:
    628             return parsed_response

ModelError: An error occurred (ModelError) when calling the InvokeEndpoint operation: Received server error (502) from container-2 with message "<html>
<head><title>502 Bad Gateway</title></head>
<body>
<center><h1>502 Bad Gateway</h1></center>
<hr><center>nginx/1.16.1</center>
</body>
</html>

Finally, a snippet from CloudWatch Logs, in case it helps:

2020/07/21 06:31:23 [error] 33#33: *14 connect() failed (111: Connection refused) while connecting to upstream, client: 10.32.0.3, server: , request: "POST /invocations HTTP/1.1", subrequest: "/v1/models/Servo:predict", upstream: "http://127.0.0.1:22001/v1/models/Servo:predict", host: "container-2.aws.local:9553"

Update: invoking/deploying with a serializer:

from sagemaker.serializers import IdentitySerializer

# Note: serializer classes from sagemaker.serializers must be instantiated
sm_model.deploy(initial_instance_count=1, instance_type='ml.c4.xlarge', endpoint_name=endpoint_name, serializer=IdentitySerializer())
