
How to run python script on EC2 instance by boto3

I read this question before: How to SSH and run commands in EC2 using boto3? Many answers just said users don't have to use SSH to connect to EC2 and run commands. However, I still don't have a clue how to run a Python script via boto3. In boto2 there is a function run_instances that lets the user pass a script to the EC2 node and run it, like the code below:

def run(self, **kwargs):
    ec2 = boto.connect_ec2(settings.PDF_AWS_KEY, settings.PDF_AWS_SECRET)
    sqs = boto.connect_sqs(settings.PDF_AWS_KEY, settings.PDF_AWS_SECRET)

    queue = sqs.create_queue(REQUEST_QUEUE)
    num = queue.count()
    launched = 0
    icount = 0

    reservations = ec2.get_all_instances()
    for reservation in reservations:
        for instance in reservation.instances:
            if instance.state == "running" and instance.image_id == AMI_ID:
                icount += 1
    to_boot = min(num - icount, MAX_INSTANCES)

    if to_boot > 0:
        startup = BOOTSTRAP_SCRIPT % {
            'KEY': settings.PDF_AWS_KEY,
            'SECRET': settings.PDF_AWS_SECRET,
            'RESPONSE_QUEUE': RESPONSE_QUEUE,
            'REQUEST_QUEUE': REQUEST_QUEUE}
        r = ec2.run_instances(
            image_id=AMI_ID,
            min_count=to_boot,
            max_count=to_boot,
            key_name=KEYPAIR,
            security_groups=SECURITY_GROUPS,
            user_data=startup)
        launched = len(r.instances)
    return launched

BOOTSTRAP_SCRIPT is a Python script passed to the instance as user data.

I wrote some code with boto3:

# -*- coding: utf-8 -*-

SCRIPT_TORUN = """

import boto3

bucket = random_str()       # random_str() is assumed to be defined elsewhere
image_name = random_str()
s3 = boto3.client('s3')
Somedata = 'hello,update'

upload_path = 'test/' + image_name
s3.put_object(Body=Somedata, Bucket='cloudcomputing.assignment.storage', Key=upload_path)

"""

import boto3

running_instance = []
ec2 = boto3.resource('ec2')

for instance in ec2.instances.all():
    if instance.state['Name'] == 'running':   # keep only running instances and save their instance ids
        running_instance.append(instance.id)
        print(instance.id, instance.state)
print(running_instance)

I can get the details of the running instances. Can anybody tell me whether there is a function like run_instances in boto3 that I can use to run the script SCRIPT_TORUN on one of my running EC2 instances?

See: Boto3 run_instances

The parameter you are looking for is: UserData='string'

UserData (string) --

The user data to make available to the instance. For more information, see Running Commands on Your Linux Instance at Launch (Linux) and Adding User Data (Windows). If you are using a command line tool, base64-encoding is performed for you, and you can load the text from a file. Otherwise, you must provide base64-encoded text.

This value will be base64 encoded automatically. Do not base64 encode this value prior to performing the operation.

If you want to run the script once and only once, specifically at EC2 launch time, then you can provide the script in user data when you call run_instances.

If, however, you want to run a script on one (or more) EC2 instances on an ad hoc basis, then you should look at either EC2 Systems Manager (the Run Command) or something like Fabric (example).

Here is how I have done it:

import boto3
import paramiko

ec2 = boto3.resource('ec2')

# List the running instances so the user can pick one by index
instances = list(ec2.instances.filter(
    Filters=[{'Name': 'instance-state-name', 'Values': ['running']}]))
for i, inst in enumerate(instances):
    print(i, inst.id, inst.instance_type)

x = int(input("Enter your choice: "))
instance = instances[x]

try:
    ssh = paramiko.SSHClient()
    ssh.set_missing_host_key_policy(paramiko.AutoAddPolicy())
    privkey = paramiko.RSAKey.from_private_key_file('address to .pem key')
    ssh.connect(instance.public_dns_name, username='ec2-user', pkey=privkey)
    stdin, stdout, stderr = ssh.exec_command('python input_x.py')
    for line in stdout.read().splitlines():
        print(line.decode())
    ssh.close()
except paramiko.SSHException as e:
    print('SSH connection failed:', e)

For credentials, I installed the AWS CLI package, then opened a terminal and ran:

aws configure

and entered the details, which are saved in the .aws folder and read automatically by boto3.

Here is how to do it using another Python library called paramiko:

import boto3
import paramiko

user_name = 'ubuntu'
instance_id = 'i-08h873123123'                  # just an example
pem_addr = '/Users/folder1/.ssh/jack-aws.pem'   # path to the instance's key file
aws_region = 'us-east-1'

ec2 = boto3.resource('ec2', region_name=aws_region)

instances = ec2.instances.filter(
    Filters=[{'Name': 'instance-state-name', 'Values': ['running']}])

p2_instance = None
for instance in instances:
    if instance.id == instance_id:
        p2_instance = instance
        break

ssh = paramiko.SSHClient()
ssh.set_missing_host_key_policy(paramiko.AutoAddPolicy())
privkey = paramiko.RSAKey.from_private_key_file(pem_addr)
ssh.connect(p2_instance.public_dns_name, username=user_name, pkey=privkey)

# You can separate two shell commands by && or ;
cmd_to_run = ('dropbox start && source /home/ubuntu/anaconda3/bin/activate py36 '
              '&& cd /home/ubuntu/xx/yy/ && python3 func1.py')
stdin, stdout, stderr = ssh.exec_command(cmd_to_run, timeout=None, get_pty=False)
ssh.close()
