
How to run a Python script on an EC2 instance with boto3

I have read this question: How to SSH and run commands in EC2 using boto3?. Many answers simply say that users don't have to use SSH to connect to EC2 and run commands. However, I still have no clue how to run a Python script with boto3. In boto2 there is a function, run_instances, which lets the user pass a script into an EC2 node and run it, like the code below:

def run(self, **kwargs):
    ec2 = boto.connect_ec2(settings.PDF_AWS_KEY, settings.PDF_AWS_SECRET)
    sqs = boto.connect_sqs(settings.PDF_AWS_KEY, settings.PDF_AWS_SECRET)

    queue = sqs.create_queue(REQUEST_QUEUE)
    num = queue.count()
    launched = 0
    icount = 0

    reservations = ec2.get_all_instances()
    for reservation in reservations:
        for instance in reservation.instances:
            if instance.state == "running" and instance.image_id == AMI_ID:
                icount += 1
    to_boot = min(num - icount, MAX_INSTANCES)

    if to_boot > 0:
        startup = BOOTSTRAP_SCRIPT % {
            'KEY': settings.PDF_AWS_KEY,
            'SECRET': settings.PDF_AWS_SECRET,
            'RESPONSE_QUEUE': RESPONSE_QUEUE,
            'REQUEST_QUEUE': REQUEST_QUEUE}
        r = ec2.run_instances(
            image_id=AMI_ID,
            min_count=to_boot,
            max_count=to_boot,
            key_name=KEYPAIR,
            security_groups=SECURITY_GROUPS,
            user_data=startup)
        launched = len(r.instances)
    return launched

BOOTSTRAP_SCRIPT is a Python script, passed to the instance as user data.

I wrote some code with boto3:

# -*- coding: utf-8 -*-

SCRIPT_TORUN = """

import boto3

bucket = random_str()        # random_str() is assumed to be defined elsewhere
image_name = random_str()
s3 = boto3.client('s3')
Somedata = 'hello,update'

upload_path = 'test/' + image_name
s3.put_object(Body=Somedata, Bucket='cloudcomputing.assignment.storage', Key=upload_path)

"""

import boto3
running_instance = []
ec2 = boto3.resource('ec2')

for instance in ec2.instances.all():
    if instance.state['Name'] == 'running':   # choose running instances and save their ids
        running_instance.append(instance.id)
        print(instance.id, instance.state)
print(running_instance)

I can get the details of the running instances. Can anybody tell me whether there is a function like run_instances in boto3 that I can use to run the script SCRIPT_TORUN on one of my running EC2 instances?

See: Boto3 run_instances

The parameter you are looking for is: UserData='string'

UserData (string) --

The user data to make available to the instance. For more information, see Running Commands on Your Linux Instance at Launch (Linux) and Adding User Data (Windows). If you are using a command line tool, base64-encoding is performed for you, and you can load the text from a file. Otherwise, you must provide base64-encoded text.

This value will be base64 encoded automatically. Do not base64 encode this value prior to performing the operation.

If you want to run the script once and only once, specifically at EC2 launch time, then you can provide the script in UserData when you call run_instances.
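As a minimal sketch of the launch-time approach in boto3 (the AMI id, key name, bucket and key below are placeholders, not values from your account), the script goes into UserData as plain text and cloud-init runs it at first boot:

```python
# The script to run at first boot; the shebang line tells cloud-init
# how to execute it.
user_data_script = """#!/usr/bin/env python3
import boto3

s3 = boto3.client('s3')
s3.put_object(Body='hello,update',
              Bucket='cloudcomputing.assignment.storage',
              Key='test/example')
"""

def launch_with_userdata(ami_id, key_name):
    """Launch one instance that runs user_data_script at boot.
    ami_id and key_name are placeholders for your own values."""
    import boto3  # deferred so the sketch can be read without AWS access
    ec2 = boto3.client('ec2')
    resp = ec2.run_instances(
        ImageId=ami_id,
        MinCount=1,
        MaxCount=1,
        KeyName=key_name,
        InstanceType='t2.micro',
        UserData=user_data_script,  # plain text; boto3 base64-encodes it for you
    )
    return [i['InstanceId'] for i in resp['Instances']]
```

Note that, per the documentation quoted above, you pass the script as plain text and must not base64-encode it yourself.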

If, however, you want to run a script on one (or more) EC2 instances on an ad hoc basis, then you should look at either EC2 Systems Manager (Run Command) or something like Fabric.
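For the ad hoc case, a rough sketch using the SSM Run Command API could look like the following. It assumes the target instance is running the SSM agent with an instance profile that permits Systems Manager access; the instance id in the usage note is a placeholder.

```python
import time

def run_script_via_ssm(instance_id, commands):
    """Send shell commands to a running instance via the built-in
    AWS-RunShellScript document and poll until they finish."""
    import boto3  # deferred so the sketch can be read without AWS access
    ssm = boto3.client('ssm')
    resp = ssm.send_command(
        InstanceIds=[instance_id],
        DocumentName='AWS-RunShellScript',  # built-in SSM document
        Parameters={'commands': commands},
    )
    command_id = resp['Command']['CommandId']
    # Poll for the invocation result, then return status and stdout.
    while True:
        time.sleep(2)
        out = ssm.get_command_invocation(CommandId=command_id,
                                         InstanceId=instance_id)
        if out['Status'] not in ('Pending', 'InProgress'):
            return out['Status'], out['StandardOutputContent']
```

Usage would be something like `run_script_via_ssm('i-0123456789abcdef0', ['python3 /home/ec2-user/myscript.py'])`, with the instance id and path replaced by your own.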

Here is how I have done it:

import boto3
import paramiko

ec2 = boto3.resource('ec2')

instances = ec2.instances.filter(
    Filters=[{'Name': 'instance-state-name', 'Values': ['running']}])
i = 0
for instance in instances:
    print(instance.id, instance.instance_type)
    i += 1
x = int(input("Enter your choice: "))
try:
    ssh = paramiko.SSHClient()
    ssh.set_missing_host_key_policy(paramiko.AutoAddPolicy())
    privkey = paramiko.RSAKey.from_private_key_file('address to .pem key')
    ssh.connect(instance.public_dns_name, username='ec2-user', pkey=privkey)
    stdin, stdout, stderr = ssh.exec_command('python input_x.py')
    stdin.flush()
    data = stdout.read().splitlines()
    for line in data:
        print(line.decode(), i)
    ssh.close()
except paramiko.SSHException as e:
    print(e)

For credentials, I installed the AWS CLI package, then opened a terminal and ran:

aws configure

Enter the details when prompted; they are saved in the .aws folder and read automatically by boto3.
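For reference, `aws configure` writes files along these lines (the values shown are placeholders):

```ini
# ~/.aws/credentials
[default]
aws_access_key_id = YOUR_ACCESS_KEY_ID
aws_secret_access_key = YOUR_SECRET_ACCESS_KEY

# ~/.aws/config
[default]
region = us-east-1
```

boto3 picks up the `default` profile from these files without any credentials appearing in your code.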

Here is how to do it using another Python library called paramiko:

import boto3
import paramiko

user_name = 'ubuntu'
instance_id = 'i-08h873123123'  # just an example
pem_addr = '/Users/folder1/.ssh/jack-aws.pem'  # path to the aws instance key
aws_region = 'us-east-1'

ec2 = boto3.resource('ec2', region_name=aws_region)
instances = ec2.instances.filter(
    Filters=[{'Name': 'instance-state-name', 'Values': ['running']}])

for instance in instances:
    if instance.id == instance_id:
        p2_instance = instance
        break

ssh = paramiko.SSHClient()
ssh.set_missing_host_key_policy(paramiko.AutoAddPolicy())
privkey = paramiko.RSAKey.from_private_key_file(pem_addr)
ssh.connect(p2_instance.public_dns_name, username=user_name, pkey=privkey)

# you can separate two shell commands with && or ;
cmd_to_run = ('dropbox start && source /home/ubuntu/anaconda3/bin/activate py36 '
              '&& cd /home/ubuntu/xx/yy/ && python3 func1.py')
stdin4, stdout4, stderr4 = ssh.exec_command(cmd_to_run, timeout=None, get_pty=False)
ssh.close()
