AWS SageMaker Model Error when making Predictor Call
I am trying to make a TensorFlow predictor call, but I receive a ModelError - 502 Bad Gateway. I can't trace what is happening on the server that causes this error. The model is deployed on an ml.c4.xlarge instance.
As requested in the comments, here is the code used to deploy the model:
# Import TensorFlow model
from sagemaker.tensorflow.serving import Model
from sagemaker.pipeline import PipelineModel

sagemaker_model = Model(
    model_data='s3://' + sagemaker_session.default_bucket()
               + '/Scikit-keras-NLP-pipeline-examplet/train/example.tar.gz',
    role=role,
    sagemaker_session=sagemaker_session)

scikit_learn_inference_model = sklearn_preprocessor.create_model()

# Build inference pipeline
sm_model = PipelineModel(
    name=model_name,
    role=role,
    models=[scikit_learn_inference_model, sagemaker_model],
    sagemaker_session=sagemaker_session)

sm_model.deploy(initial_instance_count=1,
                instance_type='ml.c4.xlarge',
                endpoint_name=endpoint_name)
For reference, this is the code in the SageMaker notebook that attempts a prediction using a .csv file saved in S3:
from sagemaker.predictor import json_serializer, csv_serializer, json_deserializer, RealTimePredictor
from sagemaker.content_types import CONTENT_TYPE_CSV, CONTENT_TYPE_JSON
import numpy as np
import pandas as pd

predictor = RealTimePredictor(
    endpoint='example',
    sagemaker_session=sagemaker_session,
    serializer=csv_serializer,
    content_type=CONTENT_TYPE_CSV,
    accept=CONTENT_TYPE_JSON)

from sklearn.pipeline import Pipeline
from sklearn.preprocessing import Binarizer, StandardScaler, OneHotEncoder
from sklearn.impute import SimpleImputer

column_names = ['CRIM', 'ZN', 'INDUS', 'CHAS', 'NOX', 'RM', 'AGE', 'DIS',
                'RAD', 'TAX', 'PTRATIO', 'B', 'LSTAT', 'MEDV']
dff = pd.read_csv('housing.csv', delimiter=r"\s+", names=column_names)
dff.drop('MEDV', axis=1, inplace=True)

# String
x = '0.00632,18.0,2.31,0,0.538,6.575,65.2,4.09,1,296.0,15.3,396.9,4.98'
# DataFrame
y = dff.head(1)
# Array
z = np.array([[0.00632, 18.0, 2.31, 0, 0.538, 6.575, 65.2, 4.09, 1,
               296.0, 15.3, 396.9, 4.98]])

print(predictor.predict(z))
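For context, a CSV serializer like the one used above turns a 2-D array into newline-delimited, comma-separated text before it is sent to the endpoint. A minimal, dependency-free sketch of that wire format (illustrative only; the real `csv_serializer` handles more input types):

```python
def to_csv_payload(rows):
    """Render a 2-D sequence as one comma-separated line per row."""
    return "\n".join(",".join(str(v) for v in row) for row in rows)

z = [[0.00632, 18.0, 2.31, 0, 0.538, 6.575, 65.2, 4.09, 1,
      296.0, 15.3, 396.9, 4.98]]
# Produces a single CSV line, matching the string `x` above
# (up to float formatting).
print(to_csv_payload(z))
```

Comparing this payload against what the serving container expects is a quick way to rule out a request-format mismatch as the cause of the 502.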
Here is the full error, for reference:
---------------------------------------------------------------------------
ModelError Traceback (most recent call last)
<ipython-input-33-1a95d0a95a02> in <module>()
----> 1 print(predictor.predict(z))
~/anaconda3/envs/tensorflow_p36/lib/python3.6/site-packages/sagemaker/predictor.py in predict(self, data, initial_args, target_model)
108
109 request_args = self._create_request_args(data, initial_args, target_model)
--> 110 response = self.sagemaker_session.sagemaker_runtime_client.invoke_endpoint(**request_args)
111 return self._handle_response(response)
112
~/anaconda3/envs/tensorflow_p36/lib/python3.6/site-packages/botocore/client.py in _api_call(self, *args, **kwargs)
314 "%s() only accepts keyword arguments." % py_operation_name)
315 # The "self" in this scope is referring to the BaseClient.
--> 316 return self._make_api_call(operation_name, kwargs)
317
318 _api_call.__name__ = str(py_operation_name)
~/anaconda3/envs/tensorflow_p36/lib/python3.6/site-packages/botocore/client.py in _make_api_call(self, operation_name, api_params)
624 error_code = parsed_response.get("Error", {}).get("Code")
625 error_class = self.exceptions.from_code(error_code)
--> 626 raise error_class(parsed_response, operation_name)
627 else:
628 return parsed_response
ModelError: An error occurred (ModelError) when calling the InvokeEndpoint operation: Received server error (502) from container-2 with message "<html>
<head><title>502 Bad Gateway</title></head>
<body>
<center><h1>502 Bad Gateway</h1></center>
<hr><center>nginx/1.16.1</center>
</body>
</html>
Finally, a snippet from the CloudWatch Logs, in case it helps:
2020/07/21 06:31:23 [error] 33#33: *14 connect() failed (111: Connection refused) while connecting to upstream, client: 10.32.0.3, server: , request: "POST /invocations HTTP/1.1", subrequest: "/v1/models/Servo:predict", upstream: "http://127.0.0.1:22001/v1/models/Servo:predict", host: "container-2.aws.local:9553"
Call/deploy with a serializer (note that `deploy` expects a serializer instance, not the class):

from sagemaker.serializers import IdentitySerializer

sm_model.deploy(initial_instance_count=1,
                instance_type='ml.c4.xlarge',
                endpoint_name=endpoint_name,
                serializer=IdentitySerializer())
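One consequence of `IdentitySerializer` is that it forwards the request body byte-for-byte, so the caller must hand `predict` an already-formatted payload. A local sketch (no AWS calls; the `predictor.predict` line is shown commented out because it requires a live endpoint):

```python
# Build the raw CSV bytes that IdentitySerializer would pass through unchanged.
row = [0.00632, 18.0, 2.31, 0, 0.538, 6.575, 65.2, 4.09, 1,
       296.0, 15.3, 396.9, 4.98]
payload = ",".join(str(v) for v in row).encode("utf-8")

# predictor.predict(payload)  # would send exactly these bytes to the endpoint
print(payload)
```

This avoids relying on the SDK to guess how to serialize a numpy array, which is one common source of malformed requests to the serving container.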