How to shrink a large Python package for an AWS Lambda layer?

I'm trying to deploy a Python package with large binary dependencies such as numpy, scipy, astropy, pandas, etc. The zip file is more than 400 MB, so I have to shrink its size below 250 MB to be able to deploy it as a Lambda layer from S3.

I know I can delete tests, docs, and __pycache__ directories with something like

# remove test suites, bundled docs, and bytecode caches
find . -name "tests" -type d | xargs rm -rf
find . -name "docs" -type d | xargs rm -rf
find . -name "__pycache__" -type d | xargs rm -rf

but this is not sufficient to get below the 250 MB limit...
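One further trick I've come across is stripping debug symbols from the compiled *.so files instead of deleting them; this can cut the size of the numpy/scipy binaries considerably. A minimal sketch, assuming GNU strip is available in the build environment (some shared objects may refuse stripping, so errors are silenced):

# strip debug symbols from compiled extension modules
find . -name "*.so" -exec strip {} \; 2>/dev/null

# stray *.pyc files outside __pycache__ are safe to remove too,
# since Python can regenerate bytecode from the *.py sources
find . -name "*.pyc" -delete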

Is it theoretically possible to delete *.pyc and *.so files? I know that, for example, the Serverless Framework with serverless-python-requirements offers a slim option that packages dependencies after removing *.pyc files and stripping the *.so binaries. But I think the Lambda environment needs these files... I don't want to use any framework if possible, and I can run Docker if needed to build the binaries from a Lambda-compatible image (I'm working from WSL at the moment). I've tried many things along these lines, for example the Docker build sketched below, but they aren't working and I'm not sure why...
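For reference, this is the kind of Docker-based build I mean; a sketch, assuming the public AWS SAM build image public.ecr.aws/sam/build-python3.9 (the exact tag may need adjusting) and a requirements.txt in the current directory:

# build the dependencies inside a Lambda-compatible image so the
# compiled binaries match the Lambda runtime (this works from WSL)
docker run --rm -v "$PWD":/work -w /work public.ecr.aws/sam/build-python3.9 \
    pip install -r requirements.txt -t python/

# a Python layer expects its packages under a top-level python/ directory
zip -r layer.zip python/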

I'd really appreciate any help understanding what is and isn't possible when deploying such a huge package to Lambda...

As a workaround when I've run into this, I did what @jordanm suggested and switched to a Fargate task.

A simpler solution as of December 2020 is to use a container image for your Lambda.
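Container images raise the size limit to 10 GB, so the whole scientific stack fits without any trimming. A minimal sketch, assuming a handler named app.handler in app.py (the file and handler names are placeholders):

# Dockerfile, based on the AWS-provided Python base image
FROM public.ecr.aws/lambda/python:3.9

# install the heavy dependencies into the task root
COPY requirements.txt .
RUN pip install -r requirements.txt --target "${LAMBDA_TASK_ROOT}"

# copy the function code and tell Lambda which handler to invoke
COPY app.py "${LAMBDA_TASK_ROOT}"
CMD ["app.handler"]

Build the image, push it to ECR, and create the function with --package-type Image; the find/strip gymnastics above then become unnecessary.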
