
How to run a python script present in an AWS EC2 whenever a lambda is triggered?

I have a python script on my AWS EC2 instance that does some job. I have to trigger that script whenever a new file enters a particular S3 bucket.

My idea was to add a lambda trigger to that bucket which in turn triggers the script on the EC2 instance, but I failed to do so.

So how can I achieve this according to my plan, or is there any other workaround for this problem?

As suggested in the comments, it is better to use SNS or SQS. I think they are more suitable than a Lambda function: with SNS or SQS there is direct one-to-one communication between S3 and the EC2 instance, so why add an extra Lambda layer?

Although all three can subscribe to the event, Lambda adds one extra layer and also involves SSH, which I think is costly in terms of time (S3 event delivery + event processing + SSH to EC2).


Using Lambda:

When the Lambda is triggered, it will SSH into the EC2 instance and run the script. One big advantage of Lambda is that you can run any type of script, and you do not need a server to keep a consumer up and running as you do with SQS and SNS. You can explore the examples ssh-ec2-lambda/ and scheduling-ssh-jobs-using-aws-lambda; the second example is similar to your case, except you trigger on the S3 event instead of on a schedule.

SNS:

If multiple instances are supposed to run the job script, then SNS is the better choice. The diagram below is somewhat similar to your use case and shows the big picture.

(diagram: S3 event published to an SNS topic that fans out to multiple EC2 instances)
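
For completeness, here is a rough sketch of what the EC2 side of the SNS route could look like: a small HTTP endpoint on each instance that the SNS topic delivers the S3 event to, which then runs the job script. The port, the script path, and shelling out with subprocess are my own illustrative assumptions, and a real deployment should also verify the SNS message signature.

import json
import subprocess
from http.server import BaseHTTPRequestHandler, HTTPServer

SCRIPT = "/home/ubuntu/job.py"  # hypothetical path to the job script

class SnsHandler(BaseHTTPRequestHandler):
    def do_POST(self):
        body = json.loads(self.rfile.read(int(self.headers["Content-Length"])))
        msg_type = self.headers.get("x-amz-sns-message-type")
        if msg_type == "SubscriptionConfirmation":
            # Visit this URL once to confirm the SNS subscription
            print("Confirm subscription at:", body["SubscribeURL"])
        elif msg_type == "Notification":
            # The S3 event is JSON-encoded inside the SNS "Message" field
            for record in json.loads(body["Message"]).get("Records", []):
                key = record["s3"]["object"]["key"]
                subprocess.run(["python3", SCRIPT, key], check=False)
        self.send_response(200)
        self.end_headers()

HTTPServer(("", 8080), SnsHandler).serve_forever()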

SQS:

If only one instance is supposed to run the script, then SQS is suitable for handling the event.

(diagram: S3 event delivered to an SQS queue consumed by a single EC2 instance)
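
Similarly, a minimal sketch of the SQS route: a long-polling consumer running on the single EC2 instance that reads the S3 event messages off the queue and runs the script. The queue name and script path are placeholders I made up for illustration.

import json
import subprocess
import boto3

sqs = boto3.resource("sqs", region_name="us-east-1")
queue = sqs.get_queue_by_name(QueueName="s3-new-file-events")  # hypothetical queue

while True:
    # Long-poll for up to 20 seconds to avoid busy-waiting
    for message in queue.receive_messages(WaitTimeSeconds=20, MaxNumberOfMessages=10):
        event = json.loads(message.body)
        # S3 test events have no "Records" key, so .get keeps this safe
        for record in event.get("Records", []):
            key = record["s3"]["object"]["key"]
            subprocess.run(["python3", "/home/ubuntu/job.py", key], check=False)
        message.delete()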

  • I am not sure why your option did not work, because it is absolutely possible; I have done this myself by following this aws blog.
  • This git repository has Terraform code to trigger a Lambda whenever a file with a specific extension is uploaded to the bucket (a boto3 sketch of the same notification setup follows this list).
  • You can access the EC2 instance via the Lambda, as shown in the blog above, using tags.
  • Hope this all helps you.
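
If you prefer not to use Terraform, here is a rough boto3 equivalent of that notification setup. The bucket name, function ARN, and the .csv suffix are placeholders of my own, not values from the repository:

import boto3

s3 = boto3.client("s3")
s3.put_bucket_notification_configuration(
    Bucket="my-input-bucket",  # hypothetical bucket
    NotificationConfiguration={
        "LambdaFunctionConfigurations": [
            {
                "LambdaFunctionArn": "arn:aws:lambda:us-east-1:123456789012:function:run-ec2-script",
                "Events": ["s3:ObjectCreated:*"],
                # Fire only for files with a specific extension
                "Filter": {
                    "Key": {"FilterRules": [{"Name": "suffix", "Value": ".csv"}]}
                },
            }
        ]
    },
)
# The Lambda also needs a resource policy allowing S3 to invoke it
# (lambda add_permission), which the Terraform code sets up for you.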

I managed it with the help of a blog I found online. I have lost the link, but I still have the code.

import boto3
import paramiko

def lambda_handler(event, context):

    # If these hard-coded keys are omitted, the Lambda execution
    # role's credentials are used automatically
    ec2 = boto3.resource('ec2', region_name='us-east-1',
                         aws_access_key_id='XXXXXXXXXXXXXXXXXXXX',
                         aws_secret_access_key='XXXXXXXXXXXXXXXXXXXX')

    instance_id = 'XXXXXXXXXXXXXXXX'

    instance = ec2.Instance(instance_id)

    # Start the instance and wait until it is actually running,
    # otherwise the SSH connection below will fail
    instance.start()
    instance.wait_until_running()
    instance.reload()  # refresh attributes such as the IP address

    # Connect to S3; we will use it to fetch the .pem key file of the EC2 instance
    s3_client = boto3.client('s3',
                             aws_access_key_id='XXXXXXXXXXXXXXXXXXXX',
                             aws_secret_access_key='XXXXXXXXXXXXXXXXXXXX')

    # Download the private key file from a secure S3 bucket and save it
    # under /tmp/, the only writable path in a Lambda environment;
    # download_file blocks until the download completes
    bucket_name = ''    # bucket holding the .pem file
    key_name = ''       # S3 key of the .pem file
    key_location = ''   # local path, e.g. somewhere under /tmp/
    s3_client.download_file(bucket_name, key_name, key_location)

    ssh = paramiko.SSHClient()
    ssh.set_missing_host_key_policy(paramiko.AutoAddPolicy())
    privkey = paramiko.RSAKey.from_private_key_file(key_location)
    # The username is most likely 'ec2-user', 'root' or 'ubuntu',
    # depending on your EC2 AMI; use instance.public_ip_address instead
    # if the Lambda does not run inside the same VPC
    ssh.connect(instance.private_ip_address, 22, username='ubuntu', pkey=privkey)



    # List the shell commands to run on the instance
    commands = []
    for command in commands:
        print("Executing {}".format(command))
        stdin, stdout, stderr = ssh.exec_command(command)
        data = stdout.read().splitlines()
        for line in data:
            print(line)
    ssh.close()

    return 'Success'

Now just zip the paramiko library along with the function code, since paramiko is not included in the Lambda runtime. I will update the answer if I find the blog again.
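
Since the handler above leaves commands empty, here is one way you could fill it from the S3 trigger event, passing the bucket and key of the new file to the script on the instance. The script path is a hypothetical example:

def build_commands(event):
    # Build one shell command per uploaded object in the S3 event
    commands = []
    for record in event.get("Records", []):
        bucket = record["s3"]["bucket"]["name"]
        key = record["s3"]["object"]["key"]
        commands.append("python3 /home/ubuntu/job.py s3://{}/{}".format(bucket, key))
    return commands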
