
How to deploy huge Python dependencies and an ML model in AWS Lambda

I am trying to deploy an image classification algorithm on AWS Lambda, and my build size is 390 MB, which exceeds the 250 MB upper limit even when uploading through an S3 bucket. The reasons for the large size are OpenCV (120 MB), a model checkpoint (112 MB), and a Caffe model (125 MB).

My question:

  1. How do I handle a scenario where I have to deploy scripts with dependencies like OpenCV? Is there any way to deal with even worse situations, such as a 1 GB zip size?

Can you have your script copy the dependencies over from an S3 bucket? You have 512 MB of /tmp. You would want to do this outside of your handler function, wrapped in some logic so that it only happens once per container lifecycle, not on every invocation. You would also have to move the imports to after the file copy in the script.

For example (the bucket, key, and paths here are placeholders):

import os
import sys
import zipfile

import boto3

DEPS_DIR = "/tmp/deps"

if not os.path.exists(DEPS_DIR):
    # Runs once per container lifecycle, not on every invocation
    boto3.client("s3").download_file("my-bucket", "deps.zip", "/tmp/deps.zip")
    zipfile.ZipFile("/tmp/deps.zip").extractall(DEPS_DIR)

sys.path.insert(0, DEPS_DIR)  # make the unpacked packages importable
import cv2  # import only after the copy has happened

def handler(event, context):
    ...  # your script

In the current AWS world, when you exceed this 512 MB /tmp threshold, you may have to move to something like AWS Fargate in lieu of Lambda.
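If it is mainly the model files (rather than compiled libraries) that push you past the limit, one alternative sketch is to stream the checkpoint from S3 straight into memory instead of /tmp, provided your model loader accepts a bytes buffer. The function and names below are illustrative, not an official API; the S3 client is passed in so the caching logic is easy to exercise with a stub.

```python
import io

_model_cache = {}  # module-level: survives warm invocations of the same container

def load_model_bytes(s3_client, bucket, key):
    """Stream an S3 object into memory, caching it for warm starts.

    s3_client is a boto3 S3 client (e.g. boto3.client("s3")); download_fileobj
    writes the object into the supplied file-like buffer.
    """
    if key not in _model_cache:
        buf = io.BytesIO()
        s3_client.download_fileobj(bucket, key, buf)
        _model_cache[key] = buf.getvalue()
    return _model_cache[key]
```

On a warm container the second invocation returns the cached bytes without touching S3, so you only pay the download cost on cold starts.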

