
Can we pass a dynamic variable to AWS Step Functions on execution?

I am using the Step Functions Data Science SDK with Python. I have a task that runs every day, and the path of the data accessed in certain steps of the state machine changes every day because it contains a date parameter.

How can I pass the date parameter when I execute the step function, and use it inside the workflow so that the new data is picked up automatically every day?

This is an example of a step I am adding to the workflow.


from stepfunctions import steps
from stepfunctions.inputs import ExecutionInput

# values supplied when the execution starts
execution_input = ExecutionInput(schema={'GlueJobName': str})

etl_step = steps.GlueStartJobRunStep(
    'Extract, Transform, Load',
    parameters={'JobName': execution_input['GlueJobName'],
                'Arguments': {
                    '--S3_SOURCE': data_source,
                    '--S3_DEST': 's3a://{}/{}/'.format(bucket, project_name),
                    '--TRAIN_KEY': train_prefix + '/',
                    '--VAL_KEY': val_prefix + '/'}
               }
)

I want to add the date variable to S3_DEST. If I use execution_input, the placeholder it returns is not a string, so I cannot concatenate it into the path.
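One workaround, sketched below on the assumption that the full path can be computed before the execution starts: pass the already-formatted destination as its own execution-input field (the S3Dest key here is hypothetical) instead of concatenating a placeholder.

from stepfunctions import steps
from stepfunctions.inputs import ExecutionInput

# hypothetical schema: the caller formats the dated path and passes it whole,
# because an ExecutionInput placeholder cannot be concatenated with strings
execution_input = ExecutionInput(schema={'GlueJobName': str, 'S3Dest': str})

etl_step = steps.GlueStartJobRunStep(
    'Extract, Transform, Load',
    parameters={'JobName': execution_input['GlueJobName'],
                'Arguments': {
                    '--S3_DEST': execution_input['S3Dest']}
               }
)

The step then receives whatever dated path the caller computed that day.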

Edit

If the date is a datetime object, you can call its strftime('%Y-%m-%d') method to output it as a string.
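A minimal sketch, assuming the path is assembled outside the state machine (bucket and project_name are placeholder values):

from datetime import datetime

bucket = 'my-bucket'          # placeholder values; use your own
project_name = 'my-project'

run_date = datetime.now().strftime('%Y-%m-%d')   # e.g. '2024-05-01'
s3_dest = 's3a://{}/{}/{}/'.format(bucket, project_name, run_date)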

Original

Step Functions supports passing input to an execution.

If you're using the SDK's start_execution call, you can use its input parameter.
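For example, a minimal sketch with boto3's start_execution; the state machine ARN and input keys are illustrative and must match your own workflow's schema:

import json
from datetime import datetime

import boto3

sfn = boto3.client('stepfunctions')
run_date = datetime.now().strftime('%Y-%m-%d')

# illustrative ARN and keys; align them with your ExecutionInput schema
sfn.start_execution(
    stateMachineArn='arn:aws:states:us-east-1:123456789012:stateMachine:etl',
    input=json.dumps({
        'GlueJobName': 'my-etl-job',
        'S3Dest': 's3a://my-bucket/my-project/{}/'.format(run_date),
    }),
)

With the Data Science SDK, the equivalent call is workflow.execute(inputs={...}), which accepts a plain dict.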

If you trigger the state machine from a CloudWatch Events rule, you can specify a constant input from the console.
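The constant is plain JSON text, for example (keys illustrative):

{
  "GlueJobName": "my-etl-job",
  "S3Dest": "s3a://my-bucket/my-project/2024-05-01/"
}

Note that a constant is static, so a date baked into it will not advance from day to day; a changing date still has to be computed by whatever starts the execution.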

