
How to deploy a custom model in AWS SageMaker?

I have a custom machine learning predictive model. I also have a user-defined Estimator class that uses Optuna for hyperparameter tuning. I need to deploy this model to SageMaker so that I can invoke it from a Lambda function.

I'm having trouble creating a container for the model and the Estimator.

I am aware that SageMaker has a scikit-learn container which can be used for Optuna, but how would I leverage it to include the functions from my own Estimator class? Also, the model is one of the parameters passed to this Estimator class, so how do I define it as a separate training job in order to turn it into an endpoint?

This is how the Estimator class and the model are invoked:

sirf_estimator = Estimator(
    SIRF, ncov_df, population_dict[countryname],
    name=countryname, places=[(countryname, None)],
    start_date=critical_country_start
    )
sirf_dict = sirf_estimator.run()

where:

  1. Model name: SIRF
  2. Cleaned dataset: ncov_df

It would be really helpful if anyone could look into this, thanks a ton!

SageMaker inference endpoints currently rely on an interface based on Docker images. At the base level, you can set up a Docker image that runs a web server and responds to requests on the ports and paths that AWS requires. This guide shows how to do it: https://docs.aws.amazon.com/sagemaker/latest/dg/your-algorithms-inference-code.html
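Concretely, the contract from that guide is a web server on port 8080 that answers GET /ping (health check) and POST /invocations (inference). A minimal stdlib-only sketch is below; the echo logic in do_POST is a placeholder where your own Estimator/SIRF prediction code would go.

```python
# Minimal sketch of the serving contract a custom SageMaker container
# must satisfy: GET /ping and POST /invocations on port 8080.
# The prediction logic here is a placeholder (it echoes the payload).
import json
from http.server import BaseHTTPRequestHandler, HTTPServer

class InferenceHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        if self.path == "/ping":
            # Return 200 once the model is loaded and ready to serve.
            self.send_response(200)
            self.end_headers()
        else:
            self.send_response(404)
            self.end_headers()

    def do_POST(self):
        if self.path == "/invocations":
            length = int(self.headers.get("Content-Length", 0))
            payload = json.loads(self.rfile.read(length) or b"{}")
            # Placeholder: call your Estimator here instead of echoing.
            body = json.dumps({"prediction": payload}).encode()
            self.send_response(200)
            self.send_header("Content-Type", "application/json")
            self.end_headers()
            self.wfile.write(body)
        else:
            self.send_response(404)
            self.end_headers()

# Inside the container you would run:
# HTTPServer(("0.0.0.0", 8080), InferenceHandler).serve_forever()
```

Your Dockerfile then just needs to start this server as the container's `serve` entry point.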

This is an annoying amount of work. If you're using a well-known framework, AWS provides a container library with boilerplate code you might be able to reuse and customize: https://github.com/aws/sagemaker-containers

Or don't use SageMaker inference endpoints at all :) If your model can fit within the size/memory limits of AWS Lambda, that is an easier option!
