
Re-hosting a trained model on AWS SageMaker

I started exploring AWS SageMaker with these examples provided by AWS. I then made some modifications to that particular setup so that it trains on data from my own use case.

Now, as I continue to work on this model and tune it, I would like to be able to recreate the same inference endpoint after deleting it -- even after stopping and restarting the notebook instance (so the notebook/kernel session is no longer valid) -- using the already trained model artifacts that get uploaded to S3 under the /output folder.

In that situation I cannot simply jump directly to this line of code:

bt_endpoint = bt_model.deploy(initial_instance_count=1, instance_type='ml.m4.xlarge')

I did some searching -- including Amazon's own example of hosting pre-trained models -- but I am a little lost. I would appreciate any guidance, examples, or documentation that I could emulate and adapt to my case.

Your comment is correct -- you can re-create an Endpoint given an existing EndpointConfiguration. This can be done via the console, the AWS CLI, or the SageMaker boto client.

