
Load a Pickled or Joblib Pre-trained ML Model to SageMaker and Host It as an Endpoint

I have a trained model saved with pickle or joblib. Let's say it's logistic regression or XGBoost.

I would like to host that model in AWS SageMaker as an endpoint without running a training job. How can I achieve that?

import joblib

# Let's say myBucketName contains model.pkl
model = joblib.load('model.pkl')

# X_test = NumPy array of test features
model.predict(X_test)

I am not interested in sklearn_estimator.fit('S3 Train', 'S3 Validate'); I already have the trained model.

For Scikit-Learn, for example, you can get inspiration from this public demo: https://github.com/awslabs/amazon-sagemaker-examples/blob/master/sagemaker-python-sdk/scikit_learn_randomforest/Sklearn_on_SageMaker_end2end.ipynb

Step 1: Save your artifact (e.g. the joblib file), compressed, in S3 at s3://<your path>/model.tar.gz
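A minimal sketch of producing that tarball and uploading it, assuming the trained estimator is in memory as model and that the bucket and key names below are placeholders:

import tarfile

import boto3
import joblib

# Serialize the estimator; the file name must match what model_fn loads later
joblib.dump(model, "model.joblib")

# SageMaker expects the artifact packaged as a gzipped tarball
with tarfile.open("model.tar.gz", "w:gz") as tar:
    tar.add("model.joblib")

# Upload the archive to S3 (bucket and key are placeholders)
boto3.client("s3").upload_file("model.tar.gz", "<your bucket>", "<your path>/model.tar.gz")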

Step 2: Create an inference script with the deserialization function model_fn. (Note that you could also add the custom inference functions input_fn, predict_fn and output_fn, but for scikit the default functions work fine; an optional sketch follows the script below.)

%%writefile inference_script.py
# The Jupyter cell magic above writes this cell to a file, in case you're working in a Jupyter notebook

import joblib
import os

# SageMaker calls model_fn at container startup to deserialize the model artifact
def model_fn(model_dir):
    clf = joblib.load(os.path.join(model_dir, "model.joblib"))
    return clf
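If the default request handling is not enough (for example, to return class probabilities instead of hard labels), an optional predict_fn can be added to the same script. This is only a sketch of such an override; it is not required for the basic setup:

# Optional override: return class probabilities instead of predicted labels
def predict_fn(input_data, model):
    return model.predict_proba(input_data)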

Step 3: Create a model associating the artifact with the right container

from sagemaker.sklearn.model import SKLearnModel

model = SKLearnModel(
    model_data='s3://<your path>/model.tar.gz',
    role='<your role>',
    entry_point='inference_script.py',
    framework_version='0.23-1')
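If you are running this from a SageMaker notebook, the role can be looked up with the SDK instead of pasting an ARN (a small convenience, assuming the notebook's execution role has the needed permissions):

import sagemaker

# Resolves the IAM role attached to the current notebook instance or Studio user
role = sagemaker.get_execution_role()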

Step 4: Deploy!

predictor = model.deploy(
    instance_type='ml.c5.large',  # choose the right instance type
    initial_instance_count=1)
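deploy() returns a predictor object; a minimal sketch of calling the live endpoint and tearing it down afterwards (assumes X_test is a NumPy array already in memory, as in the question):

# Invoke the hosted endpoint with in-memory data
predictions = predictor.predict(X_test)
print(predictions)

# Delete the endpoint when finished so it stops incurring charges
predictor.delete_endpoint()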
