
Cannot load locally pretrained model that I deployed via SageMaker notebook

I have been learning the AI/ML field lately and I am quite a beginner, so I need some help. I am trying to deploy a locally pretrained model via SageMaker to create an endpoint and use it.

I deployed a model:

from sagemaker.pytorch import PyTorchModel

# role was obtained earlier, e.g. via sagemaker.get_execution_role()
pytorch_model = PyTorchModel(model_data='model.tar.gz',
                             role=role,
                             entry_point='inference.py',
                             framework_version="1.11.0",
                             py_version="py38")

predictor = pytorch_model.deploy(instance_type='ml.g4dn.xlarge',
                                 initial_instance_count=1)

and predicted with it:

from PIL import Image 
data = Image.open('./samples/inputs/1.jpg')
result = predictor.predict(data)
img = Image.open(result)
img.show()

As a result, I got an error saying my model cannot be loaded.

You can see the error log here as well.

I didn't solve it fully, but loading the model is working well now. The problem was the structure of model.tar.gz and inference.py.

My model.tar.gz is structured like below:

model.tar.gz
|- model.pt
|- code/
   |- inference.py
   |- requirements.txt

This structure is required.

Sometimes model.pt somehow ends up as an empty file when you create model.tar.gz, so it is better to check that the size of model.pt is not 0 before making the tar.gz file.
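The packaging and the size check above can be sketched as follows. The layout mirrors the tree shown earlier; the toy files written at the top are placeholders so the snippet runs on its own, and in practice model.pt and code/ already exist from training:

```python
import os
import tarfile

# For a self-contained demo, create a toy layout first; in practice
# model.pt and code/ already exist from your training run.
os.makedirs("code", exist_ok=True)
with open("model.pt", "wb") as f:
    f.write(b"\x00" * 16)  # stand-in for real trained weights
with open("code/inference.py", "w") as f:
    f.write("# handlers go here\n")
with open("code/requirements.txt", "w") as f:
    f.write("pillow\n")

# Guard against the empty-file problem: refuse to package a zero-byte model.pt.
assert os.path.getsize("model.pt") > 0, "model.pt is empty -- re-save your model"

# Build model.tar.gz with model.pt at the root and code/ beside it.
with tarfile.open("model.tar.gz", "w:gz") as tar:
    tar.add("model.pt", arcname="model.pt")
    tar.add("code", arcname="code")

# Sanity check: list the archive members before uploading.
with tarfile.open("model.tar.gz", "r:gz") as tar:
    names = tar.getnames()
print(names)
```

Listing the members before uploading catches both a missing code/ directory and the zero-byte model.pt problem in one step.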

In addition, if something is wrong in inference.py, such as in input_fn, predict_fn, or output_fn, it may cause a "Backend worker process died" error even when model_fn works properly, in my experience.
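For reference, a minimal sketch of the four handlers the SageMaker PyTorch serving container looks for. The torch.jit.load call and the "inputs"/"outputs" JSON schema are assumptions for illustration, not the original code:

```python
# code/inference.py -- minimal sketch of the four serving handlers.
import json
import os

import torch

def model_fn(model_dir):
    # model.pt sits at the root of the extracted model.tar.gz.
    # Assumes a TorchScript archive; adjust if you saved with torch.save.
    model = torch.jit.load(os.path.join(model_dir, "model.pt"), map_location="cpu")
    model.eval()
    return model

def input_fn(request_body, content_type):
    if content_type == "application/json":
        return json.loads(request_body)  # request_body may arrive as bytes
    raise ValueError(f"Unsupported content type: {content_type}")

def predict_fn(data, model):
    tensor = torch.tensor(data["inputs"])
    with torch.no_grad():
        return model(tensor)

def output_fn(prediction, accept):
    if accept == "application/json":
        result = prediction.tolist() if hasattr(prediction, "tolist") else prediction
        return json.dumps({"outputs": result})
    raise ValueError(f"Unsupported accept type: {accept}")
```

If any of these raise an exception, the worker process dies and you see exactly the "Backend worker process died" error described above, even though model_fn itself succeeded.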

I am struggling with request_body now. I keep getting request_body as a bytearray even when I send a JSON file. I expected request_body to be JSON, but it is not.
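As far as I understand, the container hands input_fn the raw request body, so even a JSON request arrives as bytes/bytearray and has to be decoded explicitly. A sketch of handling that (the payload shape is made up for the example):

```python
import json

def input_fn(request_body, content_type="application/json"):
    # The serving stack passes the raw request body, so JSON arrives as
    # bytes/bytearray rather than a parsed dict -- decode and parse it here.
    if isinstance(request_body, (bytes, bytearray)):
        request_body = bytes(request_body).decode("utf-8")
    return json.loads(request_body)

# Example: what the handler receives for a JSON POST
payload = input_fn(bytearray(b'{"inputs": [1, 2, 3]}'))
print(payload["inputs"])
```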

I will update this if I solve the problem.

