
Access to Amazon S3 Bucket from EC2 instance

I have an EC2 instance and an S3 bucket in different regions. The bucket contains some files that are used regularly by my EC2 instance. I want to programmatically download the files to my EC2 instance (using Python).

Is there a way to do that?

There are lots of ways to do this from within Python.

Boto has S3 modules which will do this. http://boto.readthedocs.org/en/latest/ref/s3.html
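For example, a minimal sketch with the classic boto package (the bucket and key names are placeholders; credentials are picked up from environment variables, ~/.boto, or an instance profile):

import boto

# Connect to S3 and fetch one object to a local file
conn = boto.connect_s3()
bucket = conn.get_bucket('my-bucketname')     # placeholder bucket name
key = bucket.get_key('folder/file.name')      # placeholder object key
key.get_contents_to_filename('file.name')     # write the object to local disk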

You could also just use the Python requests library to download over HTTP.
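That only works if the object is publicly readable (or you generate a pre-signed URL first). A minimal sketch with placeholder names:

import requests

# Plain HTTP GET of a public (or pre-signed) S3 object URL
url = 'https://my-bucketname.s3.amazonaws.com/folder/file.name'   # placeholder URL
resp = requests.get(url, stream=True)
resp.raise_for_status()
with open('file.name', 'wb') as f:
    for chunk in resp.iter_content(chunk_size=8192):
        f.write(chunk)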

The AWS CLI also gives you an option to download from the shell:

aws s3 cp s3://bucket/folder/file.name file.name

Adding to what @joeButler has said above...

Your instances need permission to access S3 using APIs. So, you need to create an IAM role and an instance profile. Your instance needs to have the instance profile assigned when it is created. See page 183 (as indicated at the bottom of the page; the topic name is "Using an IAM Role to Grant Permissions to Applications Running on Amazon EC2 Instances") of this guide: AWS IAM User Guide, to understand the steps and procedure.
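Once the role is attached, the SDK picks up temporary credentials from the instance profile automatically, so nothing needs to be hard-coded. A minimal sketch with boto3 and placeholder names:

import boto3

# No keys in code: credentials come from the instance profile
s3 = boto3.client('s3')
s3.download_file('my-bucketname', 'folder/file.name', 'file.name')   # placeholder bucket/key/filename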

I work for Minio. It's open-source, S3-compatible object storage written in Golang.

You can use the minio-py client library; it's open source and compatible with AWS S3. Below is a simple example, get_object.py:

from minio import Minio
from minio.error import ResponseError

# Point the client at the AWS S3 endpoint; replace the placeholder keys with your own
client = Minio('s3.amazonaws.com',
               access_key='YOUR-ACCESSKEYID',
               secret_key='YOUR-SECRETACCESSKEY')

# Get a full object
try:
    data = client.get_object('my-bucketname', 'my-objectname')
    with open('my-testfile', 'wb') as file_data:
        for d in data:
            file_data.write(d)
except ResponseError as err:
    print(err)

You can also use the Minio client (aka mc); it comes with an mc mirror command to do the same. You can add it to cron.

$ mc mirror s3/mybucket localfolder

Note:

  • s3 is an alias
  • mybucket is your AWS S3 bucket
  • localfolder is the local directory on the EC2 machine that the bucket is mirrored into.

Installing Minio Client:

GNU/Linux

Download mc for your platform, then add your S3 credentials as a host alias:

$ ./mc config host add mys3 https://s3.amazonaws.com BKIKJAA5BMMU2RHO6IBB V7f1CwQqAcwo80UEIJEjc5gVQUSSx5ohQ9GSrr12

Note: Replace access & secret key with yours.

As mentioned above, you can do this with Boto. To make it more secure and not worry about the user credentials, you could use IAM to grant the EC2 machine access to the specific bucket only. Hope that helps.
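A minimal sketch of such a bucket-scoped policy (my-bucketname is a placeholder), attached to the IAM role the EC2 instance assumes:

{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Action": ["s3:ListBucket"],
      "Resource": "arn:aws:s3:::my-bucketname"
    },
    {
      "Effect": "Allow",
      "Action": ["s3:GetObject"],
      "Resource": "arn:aws:s3:::my-bucketname/*"
    }
  ]
}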

If you want to use Python, you may want to use the newer boto3 API. I personally like it more than the original boto package. It works with both python2 and python3, and the differences are minimal.

You can specify a region when you create a new bucket (see the boto3 client documentation), but bucket names are globally unique, so you shouldn't need one to connect to it. And you probably don't want to use a bucket in a different region than your instance, because you will pay for data transfer between regions.
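A minimal sketch with the boto3 resource API (placeholder bucket and key names; credentials are resolved from the environment, ~/.aws/credentials, or the instance's IAM role):

import boto3

# Download one object from the bucket to a local file
s3 = boto3.resource('s3')
s3.Bucket('my-bucketname').download_file('folder/file.name', 'file.name')   # placeholder names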
