
How to use a .joblib model from Amazon SageMaker in a local environment?

I created a model in AWS using SageMaker and downloaded model.joblib to my machine. I am trying to use it to make some predictions. I can load the file:

import joblib
import mlio     # dependency of the SageMaker-generated model object
import sklearn  # dependency of the SageMaker-generated model object

# path to the model.joblib artifact extracted from model.tar.gz
filename = r"C:\Users\benki\Downloads\model.tar\model.joblib"
loaded_model = joblib.load(filename)

However, I am not sure where to go from here. I've tried calling score and predict on 'loaded_model', but I only get error messages explaining that 'loaded_model' does not have these attributes. 'loaded_model' is of type sagemaker_sklearn_extension.externals.automl_transformer.AutoMLTransformer.
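To see what the loaded object actually exposes, a generic inspection like the following can be run (plain Python, nothing SageMaker-specific is assumed here):

print(type(loaded_model))
# list the public methods/attributes on the loaded object
print([name for name in dir(loaded_model) if not name.startswith("_")])
# a transformer-style object would expose transform() rather than predict()
print(hasattr(loaded_model, "predict"), hasattr(loaded_model, "transform"))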

In AWS, from a SageMaker Jupyter Notebook instance, I can make predictions with the following:

import boto3

# runtime client used to call the deployed SageMaker endpoint
client = boto3.client("sagemaker-runtime")

endpoint_name = "My_Model"
content_type = "text/csv"
accept = "text/csv"

# payloads: iterable of CSV-formatted request bodies prepared earlier;
# invoke the endpoint once per sample and collect the results
predict = []
for sample in payloads:
    response = client.invoke_endpoint(
        EndpointName=endpoint_name,
        ContentType=content_type,
        Accept=accept,
        Body=sample,
    )
    inference = response["Body"].read().decode("ascii")
    predict.append((sample, inference))

How do I actually use this joblib model to make predictions locally?

Taking a look at both the exact error log and your entire local inference code might prove helpful in this case. However, I do have some ideas for you to check in the meantime:

  • SageMaker endpoints are deployed over full application environments, where code and artifacts are integrated. It is likely that you lack some of the resources SageMaker uses to set up inference, such as transformer definitions or pre-processing steps.

  • It is worth looking at the source code of the AutoMLTransformer class. Its constructor requires access to feature and target sklearn transformer objects containing fit() and transform() methods. Are these part of your local pipeline? (A minimal sketch of driving the loaded transformer directly follows after this list.)

  • After training a model in SageMaker, some artifacts are created and stored under a user-defined key path in Amazon S3 (typically s3://bucket/your_artifacts_path/model.tar.gz). When you decompress it, you should be able to see a model/ folder containing the model.joblib artifact and a code/ sub-folder. Inside the latter, automatically generated reference Python scripts for both data pre-processing and serving are available. Some parts of them might not run as-is in your local environment, but try using the serving script as a reference to determine how to pre-process incoming records and make calls against the model object. (An extraction sketch also follows below.)
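Regarding the second point, here is a minimal sketch of driving the loaded object directly as a scikit-learn-style transformer. This is only an assumption based on the class design (it is a preprocessing step, not a predictor); the input file and its layout are placeholders, not something taken from your setup:

import joblib
import pandas as pd

loaded_model = joblib.load(r"C:\Users\benki\Downloads\model.tar\model.joblib")

# hypothetical raw input; replace with records shaped like your training data
raw_records = pd.read_csv("my_samples.csv", header=None).to_numpy()

# a transformer-style object would expose transform() rather than predict();
# whether it runs locally depends on the resources mentioned above
if hasattr(loaded_model, "transform"):
    transformed = loaded_model.transform(raw_records)
    print(transformed[:5])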
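For the third point, a rough sketch of pulling down and unpacking the training artifact so you can inspect the generated code/ folder; the bucket and key below are placeholders for your own artifact path:

import tarfile
import boto3

# placeholders: substitute the S3 location of your own training artifacts
bucket = "my-bucket"
key = "your_artifacts_path/model.tar.gz"

boto3.client("s3").download_file(bucket, key, "model.tar.gz")

# list the archive contents, then extract; per the layout described above you
# should find model.joblib alongside a code/ sub-folder with the generated
# pre-processing and serving scripts
with tarfile.open("model.tar.gz", "r:gz") as tar:
    for member in tar.getnames():
        print(member)
    tar.extractall("model_artifacts")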

Let me know if you find something else!

