
Python boto3 upload file to S3 from EC2

If I want to upload a file from my Mac to S3, I can use:

import boto3

s3 = boto3.resource('s3', region_name="us-west-1")
# upload_file(local_path, bucket_name, object_key)
s3.meta.client.upload_file('/User/gantao/amz.jpg', 'gantao_created', 'amz.jpg')

But,

Is there any way to upload a file to S3 from an EC2 instance (without SSHing into the EC2 instance)?

The short answer is "yes."

The longer answer is that, in order to upload a file from an EC2 instance to S3, the EC2 instance has to have permission to put objects into the target S3 bucket.

You can either do this by configuring your boto client inside your Python code, or by creating and assigning an IAM role with that permission to your EC2 instance.
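For reference, the in-code configuration looks roughly like this. This is a minimal sketch: the key values are placeholders, and the path and bucket name are taken from the question.

import boto3

# Placeholder credentials for illustration only -- never commit real keys.
s3 = boto3.client(
    's3',
    region_name='us-west-1',
    aws_access_key_id='AKIA...EXAMPLE',         # hypothetical key ID
    aws_secret_access_key='EXAMPLE-SECRET-KEY'  # hypothetical secret
)

# upload_file(local_path, bucket_name, object_key)
s3.upload_file('/User/gantao/amz.jpg', 'gantao_created', 'amz.jpg')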

The more secure way is to assign the IAM role, since it doesn't require you to have credentials in code or in a file in source control.

  1. Use the AWS console to create an IAM role with permission to 'put' objects to the S3 bucket in question.
  2. Assign the IAM role to your EC2 instance.
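Once the role is attached, boto3 running on the instance obtains temporary credentials from the instance profile automatically, so no keys appear in your code. A minimal sketch (the local path here is assumed; the bucket name is taken from the question):

import boto3

# On an EC2 instance with an attached IAM role, boto3 picks up
# temporary credentials from the instance metadata automatically.
s3 = boto3.client('s3', region_name='us-west-1')

# upload_file(local_path, bucket_name, object_key)
s3.upload_file('/tmp/amz.jpg', 'gantao_created', 'amz.jpg')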

To create an IAM role:

https://docs.aws.amazon.com/AWSEC2/latest/UserGuide/iam-roles-for-amazon-ec2.html

To add an IAM role to your EC2 instance:

https://aws.amazon.com/blogs/security/easily-replace-or-attach-an-iam-role-to-an-existing-ec2-instance-by-using-the-ec2-console/
