
Import PKL in sagemaker AWS to publish API

I have a .pkl file that is the result of a trained model, and I want to create an endpoint from SageMaker to be able to consume the predictions. I have already managed to read the file from S3, but I can't find exact documentation on how to expose the "compiled" model as an API.

import boto3
import pickle

# Read the pickled model bytes straight from S3
s3 = boto3.resource('s3')
model_bytes = s3.Bucket("sagemake-models-workshop").Object(
    "pikle-file/contatos/xgb_contratos_mensual_RandomizedSearchLinux.pkl"
).get()['Body'].read()

# Deserialize back into the trained XGBRegressor
bucket_pickle = pickle.loads(model_bytes)

output:

bucket_pickle

XGBRegressor(base_score=0.5, booster='gbtree', colsample_bylevel=1,
         colsample_bynode=1, colsample_bytree=0.1, gamma=0, gpu_id=-1,
         importance_type='gain', interaction_constraints='',
         learning_rate=0.33, max_delta_step=0, max_depth=3,
         min_child_weight=1, missing=nan, monotone_constraints='()',
         n_estimators=150, n_jobs=0, num_parallel_tree=1, random_state=0,
         reg_alpha=0, reg_lambda=1, scale_pos_weight=1, subsample=1,
         tree_method='exact', validate_parameters=1, verbosity=None)

If your model is an XGBoost model, you can look at deploying it using the SageMaker XGBoost framework container. Please see this link for details on Bring Your Own Model for XGBoost.
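
A minimal sketch of that Bring Your Own Model flow, assuming you run it from a SageMaker notebook with an execution role, that the pickle was produced with an XGBoost version compatible with the container (the framework_version "1.3-1" here is only an example), and that names like inference.py, model.tar.gz and ml.m5.large are placeholders you would adjust:

First, an entry point script packaged with the model tells the container how to load and score the pickled XGBRegressor:

# inference.py -- used by the SageMaker XGBoost framework container
import io
import os
import pickle

import numpy as np

def model_fn(model_dir):
    # SageMaker extracts model.tar.gz into model_dir before calling this
    path = os.path.join(model_dir, "xgb_contratos_mensual_RandomizedSearchLinux.pkl")
    with open(path, "rb") as f:
        return pickle.load(f)

def input_fn(request_body, request_content_type):
    # Accept CSV rows and hand the scikit-learn style predict() a 2-D array
    if request_content_type == "text/csv":
        if isinstance(request_body, bytes):
            request_body = request_body.decode("utf-8")
        return np.atleast_2d(np.genfromtxt(io.StringIO(request_body), delimiter=","))
    raise ValueError(f"Unsupported content type: {request_content_type}")

def predict_fn(input_data, model):
    # The pickle holds an XGBRegressor (scikit-learn API), so predict() takes the array directly
    return model.predict(input_data)

Then the .pkl is repackaged into the model.tar.gz layout SageMaker expects and deployed as an endpoint:

import tarfile

import sagemaker
from sagemaker.xgboost import XGBoostModel

# Bundle the pickled model into model.tar.gz
with tarfile.open("model.tar.gz", "w:gz") as tar:
    tar.add("xgb_contratos_mensual_RandomizedSearchLinux.pkl")

sess = sagemaker.Session()
model_data = sess.upload_data(
    "model.tar.gz",
    bucket="sagemake-models-workshop",   # reusing the bucket from the question
    key_prefix="pikle-file/contatos",
)

xgb_model = XGBoostModel(
    model_data=model_data,
    role=sagemaker.get_execution_role(),  # assumes this runs inside SageMaker
    entry_point="inference.py",
    framework_version="1.3-1",            # pick a version compatible with the pickle
)

predictor = xgb_model.deploy(
    initial_instance_count=1,
    instance_type="ml.m5.large",
)

# Once deployed, the endpoint can be called with CSV rows, for example via the
# SageMaker runtime invoke_endpoint API with ContentType='text/csv'.

The key points are that the framework container expects the model artifact as a model.tar.gz in S3 rather than a bare .pkl, and that the entry point's model_fn/predict_fn bridge between the container's request handling and the pickled scikit-learn style regressor.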

I work for AWS and my opinions are my own.
