
How to autoscale a SKLearn job on SageMaker

I want to launch a SKLearn job using SageMaker. The way I do this is as follows:

from sagemaker.sklearn.estimator import SKLearn

FRAMEWORK_VERSION = '0.23-1'
script_path = 'main.py'

sklearn = SKLearn(
    entry_point=script_path,
    framework_version=FRAMEWORK_VERSION,
    role=role,  # an IAM role with SageMaker permissions (required by the estimator)
    instance_type='ml.m5.2xlarge',
    source_dir='src',
    output_path='my/output/path',
)

I am not sure whether the instance_type I have chosen has enough memory (and other resources) for my application, though.

Is there a way to let SageMaker decide on the instance type?

Or, is there a way to choose an instance_type and, if the job is about to run out of memory along the way, have SageMaker automatically scale up?

An automatic scale-up feature for training does not exist in SageMaker at this time.
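A common workaround is to handle this on the client side: if a training run fails for lack of memory, relaunch it on the next larger instance type. The sketch below is illustrative, not a SageMaker API; `INSTANCE_LADDER` and the `launch` callable are hypothetical names, and the `MemoryError` stands in for whatever failure your launcher raises on an out-of-memory exit.

```python
# Sketch: retry a training job on progressively larger instances.
# INSTANCE_LADDER and launch() are illustrative, not part of SageMaker.

INSTANCE_LADDER = ['ml.m5.2xlarge', 'ml.m5.4xlarge', 'ml.m5.12xlarge']

def next_instance(current):
    """Return the next larger instance type, or None at the top of the ladder."""
    i = INSTANCE_LADDER.index(current)
    return INSTANCE_LADDER[i + 1] if i + 1 < len(INSTANCE_LADDER) else None

def train_with_retries(launch, start='ml.m5.2xlarge'):
    """Call launch(instance_type); on an out-of-memory failure, move up the ladder."""
    instance = start
    while instance is not None:
        try:
            return launch(instance)
        except MemoryError:
            instance = next_instance(instance)
    raise RuntimeError('Job ran out of memory even on the largest instance type')
```

In practice, `launch` would construct the SKLearn estimator with the given `instance_type` and call `.fit()`, inspecting the job's failure reason to decide whether the error was memory-related.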

On a separate note, for selecting the right instance type for inference, there is the Inference Recommender service (https://docs.aws.amazon.com/sagemaker/latest/dg/inference-recommender.html).
