
.py file in EC2 instance to execute on event from S3 Bucket

I have a .py file on an EC2 instance. I'm trying to have the .py file run whenever an event (a file uploaded to an S3 bucket) occurs.

I currently have an event notification sent to an AWS Lambda function that starts the EC2 instance; here is that code from the AWS console:

import boto3

# start_instances expects a list of instance-ID strings,
# e.g. ['i-0123456789abcdef0']
INSTANCE_IDS = ['<ec2-instance-id>']

def lambda_handler(event, context):
    ec2 = boto3.client('ec2')
    ec2.start_instances(InstanceIds=INSTANCE_IDS)
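For reference, the S3 event the Lambda receives already names the uploaded object, which the instance-side script will eventually need. A minimal sketch of pulling the bucket and key out of it (the event shape follows the standard S3 notification format; the sample event below is trimmed and illustrative):

```python
import urllib.parse

def extract_s3_object(event):
    """Return (bucket, key) for the first record in an S3 notification event."""
    record = event['Records'][0]
    bucket = record['s3']['bucket']['name']
    # Object keys arrive URL-encoded (spaces become '+', etc.), so decode them.
    key = urllib.parse.unquote_plus(record['s3']['object']['key'])
    return bucket, key

# Illustrative event, trimmed to just the fields used above:
sample_event = {
    'Records': [{'s3': {'bucket': {'name': 'my-bucket'},
                        'object': {'key': 'incoming/my+file.csv'}}}]
}
print(extract_s3_object(sample_event))  # ('my-bucket', 'incoming/my file.csv')
```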

I can manually go into PuTTY and type "python test.py" to run my program, and it works, but I want to get rid of the manual step and have it run itself whenever there is an event.

I am stumped as to how to progress.

I thought that by "starting" my EC2 instance it would run that .py file and get to work processing what's in the S3 bucket.

There are no error messages... it just doesn't do anything at all. It's supposed to work like this: once a file is uploaded to the S3 bucket, a notification is sent to the Lambda, which starts the EC2 instance so it can process the file with the .py file that is on it.

Kind regards

This is a nice trick you can try - https://aws.amazon.com/premiumsupport/knowledge-center/execute-user-data-ec2/

By default, User Data scripts run only on an instance's first boot; this method lets you run them on every boot. Just update the bash from:

/bin/echo "Hello World" >> /tmp/testfile.txt

to:

python /file_path/python_file.py &
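For completeness, the trick in that article works by telling cloud-init to run the `scripts-user` module on every boot, via a MIME multi-part User Data block. A sketch along the lines of what the article describes (the script path is the same placeholder as above):

```
Content-Type: multipart/mixed; boundary="//"
MIME-Version: 1.0

--//
Content-Type: text/cloud-config; charset="us-ascii"

#cloud-config
cloud_final_modules:
- [scripts-user, always]

--//
Content-Type: text/x-shellscript; charset="us-ascii"

#!/bin/bash
python /file_path/python_file.py &
--//--
```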

Use Cron:

$ sudo apt-get install cron
$ crontab -e

# When crontab asks which editor to use, choose vim (option 3)

# Type "i" to insert the line below

@reboot python /path_directory/python_test.py &

# Type ":wq" to save and exit

To find the .py file, run:

sudo find / -type f -iname "python_test.py"

Then add the path to Cron.
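One thing to keep in mind with the `@reboot` approach: the script starts with no knowledge of which object triggered the boot, so it has to discover the newest upload itself. A hedged sketch, assuming the script lists the bucket with boto3's `list_objects_v2` and picks the most recently modified key (the bucket name would come from your own config):

```python
def newest_key(objects):
    """Given S3 object dicts (each with 'Key' and 'LastModified'),
    return the key of the most recently modified one, or None if empty."""
    if not objects:
        return None
    return max(objects, key=lambda o: o['LastModified'])['Key']

def newest_upload(bucket):
    import boto3  # imported lazily so the helper above works without AWS
    # Requires instance credentials with s3:ListBucket on the bucket.
    resp = boto3.client('s3').list_objects_v2(Bucket=bucket)
    return newest_key(resp.get('Contents', []))
```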

Take a look at AWS Systems Manager Run Command as a way to run arbitrary scripts on EC2. You can do that from your boto3 client, but you'll probably have to use a boto3 waiter to wait for the EC2 instance to start before sending the command.
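A sketch of what the Run Command route could look like from the Lambda side, assuming the instance runs the SSM agent with an instance profile that permits it. `AWS-RunShellScript` is a real managed SSM document and `instance_running` a real EC2 waiter; the script path is a placeholder:

```python
def build_run_command(instance_id, command):
    """Build kwargs for ssm.send_command running one shell command on one instance."""
    return {
        'InstanceIds': [instance_id],
        'DocumentName': 'AWS-RunShellScript',   # AWS-managed SSM document
        'Parameters': {'commands': [command]},
    }

def run_script(instance_id, script_path='/home/ec2-user/test.py'):
    import boto3  # lazy import: the kwargs builder above has no AWS dependency
    # Wait until the started instance is running before sending the command.
    boto3.client('ec2').get_waiter('instance_running').wait(
        InstanceIds=[instance_id])
    ssm = boto3.client('ssm')
    return ssm.send_command(
        **build_run_command(instance_id, f'python {script_path}'))
```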

Note that if you're only starting the EC2 instance and running this script infrequently, it might be more cost-effective to simply launch a new EC2 instance, run your script, and then terminate it. While an EC2 instance is stopped, you are still charged for the EBS storage attached to it and for any unused Elastic IP addresses.

If all you need is to run some Python code and the main limitation is running time, it might be a better idea to use Lambda to listen to the S3 event and Fargate to execute the task. The main advantage is that you don't have to worry about starting/stopping your instance, and scaling out would be easier.
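A rough sketch of that pattern from the Lambda side, using boto3's `ecs.run_task` (a real API); the cluster name, task definition, and subnet ID are placeholders for illustration:

```python
def build_fargate_task(cluster, task_definition, subnets):
    """Build kwargs for ecs.run_task launching a one-off Fargate task."""
    return {
        'cluster': cluster,
        'taskDefinition': task_definition,
        'launchType': 'FARGATE',
        'count': 1,
        'networkConfiguration': {
            'awsvpcConfiguration': {'subnets': subnets,
                                    'assignPublicIp': 'ENABLED'}},
    }

def lambda_handler(event, context):
    import boto3  # lazy import so the builder above is testable offline
    # 'default', 's3-processor' and the subnet ID are illustrative placeholders.
    return boto3.client('ecs').run_task(
        **build_fargate_task('default', 's3-processor', ['subnet-xxxxxxxx']))
```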

There is a nice write-up of a working use case on the Serverless blog.
