How can I make files from an S3 bucket mounted to an AWS EC2 instance using goofys available to an AWS Lambda function?

I've mounted a public S3 bucket to an AWS EC2 instance using Goofys (similar to s3fs), which lets me access files in the S3 bucket on my EC2 instance as if they were local paths. I want to use these files in my AWS Lambda function, passing the local paths through the Lambda event parameter in Python. Given that AWS Lambda has a storage limit of 512 MB, is there a way I can give the Lambda function access to the files on my EC2 instance?

AWS Lambda works really well for my purpose (I'm calculating a statistical correlation between two files, which takes 1-1.5 seconds), so it'd be great if anyone knows a way to make this work.

Appreciate the help.

EDIT:

In my AWS Lambda function, I am using the Python library pyranges, which expects local paths to files.

You have a few options:

  • Have your Lambda function first download the files from S3 to its /tmp folder, using boto3, before invoking pyranges (see the first sketch below).
  • Possibly use S3Fs to emulate file handles for S3 objects (see the second sketch below).
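
A minimal sketch of the first option. The bucket name, object keys, and file names are placeholders (not from the question); the handler downloads both objects into Lambda's writable /tmp space and then hands the local paths to pyranges:

```python
import boto3
import pyranges as pr

s3 = boto3.client("s3")

def handler(event, context):
    # Assumed event layout (placeholder keys, not part of the original question).
    bucket = event["bucket"]
    key_a = event["file_a_key"]
    key_b = event["file_b_key"]

    # Download both objects into /tmp, the only writable path in Lambda
    # (512 MB by default).
    local_a = "/tmp/file_a.bed"
    local_b = "/tmp/file_b.bed"
    s3.download_file(bucket, key_a, local_a)
    s3.download_file(bucket, key_b, local_b)

    # pyranges can now read the files from local paths as it expects.
    gr_a = pr.read_bed(local_a)
    gr_b = pr.read_bed(local_b)

    # ... compute the correlation between the two ranges here ...
    return {"statusCode": 200}
```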
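For the second option, a rough sketch assuming the bucket is public and that you can build the PyRanges objects from DataFrames instead of paths (since pyranges itself expects local paths, the data is read through s3fs file handles with pandas first). The S3 URIs are placeholders:

```python
import s3fs
import pandas as pd
import pyranges as pr

fs = s3fs.S3FileSystem(anon=True)  # anonymous access, since the bucket is public

def handler(event, context):
    # Placeholder S3 URIs; in practice they would come from the event payload.
    path_a = "s3://example-bucket/file_a.bed"
    path_b = "s3://example-bucket/file_b.bed"

    # Open each object as a file-like handle rather than a local path.
    with fs.open(path_a, "rb") as f:
        df_a = pd.read_csv(f, sep="\t", header=None,
                           names=["Chromosome", "Start", "End"])
    with fs.open(path_b, "rb") as f:
        df_b = pd.read_csv(f, sep="\t", header=None,
                           names=["Chromosome", "Start", "End"])

    # Construct PyRanges objects directly from the DataFrames.
    gr_a = pr.PyRanges(df_a)
    gr_b = pr.PyRanges(df_b)

    # ... compute the correlation between the two ranges here ...
    return {"statusCode": 200}
```

Either way, pyranges (and s3fs/pandas for the second option) would need to be packaged with the function, for example in a Lambda layer.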
