I am trying to deploy an image classification algorithm on AWS Lambda, but my build size is 390 MB, which exceeds the 250 MB upper limit even when uploading the package through an S3 bucket. The size comes from OpenCV (120 MB), a model checkpoint (112 MB), and a Caffe model (125 MB).
My question is: how can I work around this limit?
Can you have your script copy the dependencies over from an S3 bucket at runtime? Lambda gives you 512 MB of storage under /tmp. You would want to do the copy outside of your handler function, wrapped in some logic so that it only happens once per container lifecycle, not on every invocation. You would also have to move the imports so they run after the file copy.
e.g.

import os, sys

if not os.path.exists('/tmp/cv2'):   # only on a cold container
    ...                              # use boto3 to copy the files from S3
sys.path.insert(0, '/tmp')
import cv2

def handler(event, context):
    ...                              # your script
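The once-per-container guard above can be factored into a small helper. This is a minimal sketch, not the only way to do it: `ensure_dependency` and its `fetch` callable are hypothetical names, and on Lambda `fetch` would typically be a `boto3` S3 `download_file` call pointed at your bucket.

```python
import os

def ensure_dependency(local_path, fetch):
    """Materialize a dependency in /tmp only if it is not already there.

    `fetch` is any callable that writes the file at local_path; on Lambda
    it would be something like:
        lambda p: boto3.client('s3').download_file('my-bucket', 'cv2.zip', p)
    (bucket and key here are placeholders).
    Returns True if fetch ran, False if a cached copy was reused.
    """
    if os.path.exists(local_path):
        return False          # warm container: reuse the cached copy
    fetch(local_path)
    return True
```

Because module-level code runs once per container, calling this at import time in your handler module means warm invocations skip the S3 download entirely.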
In the current AWS world, once you exceed this 512 MB threshold you may have to move to something like AWS Fargate in lieu of Lambda.