
Local access to Amazon S3 Bucket from EC2 instance

I have an EC2 instance and an S3 bucket in the same region. The bucket contains reasonably large (5-20 MB) files that are used regularly by my EC2 instance.

I want to programmatically open the file on my EC2 instance (using Python). Like so:

file_from_s3 = open('http://s3.amazonaws.com/my-bucket-name/my-file-name')

But using an "http" URL to access the file remotely seems grossly inefficient; surely this means downloading the file to the server every time I want to use it.

What I want to know is: is there a way I can access S3 files locally from my EC2 instance? For example:

file_from_s3 = open('s3://my-bucket-name/my-file-name')

I can't find a solution myself; any help would be appreciated. Thank you.

Whatever you do, the object will be downloaded behind the scenes from S3 to your EC2 instance. That cannot be avoided.

If you want to treat files in the bucket as local files, you need to install one of the several S3 filesystem plugins for FUSE (for example, s3fs-fuse). Alternatively, you can use boto (or its successor boto3) for easy access to S3 objects from Python code.
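Since the object has to come over the network either way, one common pattern is to download it once and reuse the local copy on subsequent opens. Below is a minimal sketch of that idea; it assumes boto3 is installed and credentials are available (e.g. via an EC2 instance role), and `parse_s3_uri`, `open_s3_file`, and the cache directory are illustrative names, not part of any library:

```python
import os
from urllib.parse import urlparse


def parse_s3_uri(uri):
    """Split an 's3://bucket/key' URI into (bucket, key)."""
    parsed = urlparse(uri)
    return parsed.netloc, parsed.path.lstrip('/')


def open_s3_file(uri, cache_dir='/tmp/s3-cache'):
    """Download an S3 object once, then reuse the cached local copy.

    Requires boto3 and AWS credentials (e.g. an EC2 instance role).
    """
    bucket, key = parse_s3_uri(uri)
    local_path = os.path.join(cache_dir, bucket, key)
    if not os.path.exists(local_path):
        os.makedirs(os.path.dirname(local_path), exist_ok=True)
        import boto3  # assumed installed: pip install boto3
        boto3.client('s3').download_file(bucket, key, local_path)
    return open(local_path, 'rb')


# Usage (bucket/key from the question; needs real credentials to run):
# file_from_s3 = open_s3_file('s3://my-bucket-name/my-file-name')
```

Note that this naive cache never invalidates; if the objects change in S3, you would want to compare ETags or last-modified timestamps before trusting the local copy.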
