
AWS Lambda: Unable to import module 'lambda_function': No module named PIL

I am using a Lambda function with SearchFacesByImage, and I am following this doc: https://aws.amazon.com/blogs/machine-learning/build-your-own-face-recognition-service-using-amazon-rekognition/

For the comparison step I am using this import:

from PIL import Image

And I am getting this error: Unable to import module 'lambda_function': No module named PIL

You are getting this error because PIL (for Python 2.x) and its successor Pillow (for Python 3.x) are not standard libraries available in the Python Lambda environment.

To use such a library, you have to build a custom deployment package containing all the libraries you need as well as the Python code you want to deploy. This package can be built easily either in Docker or on an EC2 instance.

Here is the procedure for building that deployment package on EC2:

  1. Suppose you have your file named CreateThumbnail.py

  2. If your source code is on your local machine, copy it over to the EC2 instance:

    scp -i key.pem /path/to/my_code.py ec2-user@public-ip-address:~/CreateThumbnail.py

  3. Connect to a 64-bit Amazon Linux instance via SSH.

    ssh -i key.pem ec2-user@public-ip-address

  4. Install Python 3.6 and virtualenv using the following steps:

    a) sudo yum install -y gcc zlib zlib-devel openssl openssl-devel

    b) wget https://www.python.org/ftp/python/3.6.1/Python-3.6.1.tgz

    c) tar -xzvf Python-3.6.1.tgz

    d) cd Python-3.6.1 && ./configure && make

    e) sudo make install

    f) sudo /usr/local/bin/pip3 install virtualenv

  5. Create and activate a virtual environment using the virtualenv installed via pip3:

    /usr/local/bin/virtualenv ~/shrink_venv

    source ~/shrink_venv/bin/activate

  6. Install libraries in the virtual environment

    pip install Pillow

    pip install boto3

  7. Add the contents of the lib and lib64 site-packages directories to your .zip file. Note that the following steps assume you used Python runtime version 3.6; if you used version 2.7 you will need to adjust accordingly.

    cd $VIRTUAL_ENV/lib/python3.6/site-packages

    zip -r9 ~/CreateThumbnail.zip *

    Note: to include all hidden files, use the following option:

    zip -r9 ~/CreateThumbnail.zip .

  8. Add your Python code to the .zip file:

    cd ~

    zip -g CreateThumbnail.zip CreateThumbnail.py

Now CreateThumbnail.zip is your custom deployment package; just copy it to S3 and upload it to your Lambda function.
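As a rough sketch of that last step (the bucket and function names below are placeholders, not from the original answer), the copy and upload can be done with the AWS CLI:

    # Copy the deployment package to S3 (bucket name is a placeholder)
    aws s3 cp ~/CreateThumbnail.zip s3://my-deployment-bucket/CreateThumbnail.zip

    # Point the existing Lambda function at the new package
    aws lambda update-function-code \
        --function-name CreateThumbnail \
        --s3-bucket my-deployment-bucket \
        --s3-key CreateThumbnail.zip

For small packages you can also skip S3 and upload directly with --zip-file fileb://CreateThumbnail.zip.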

This example is taken from the official AWS documentation at https://docs.aws.amazon.com/lambda/latest/dg/with-s3-example-deployment-pkg.html

Even though the documentation clearly outlines the steps to manually create the zip artifact for your Lambda function, this solution is not very scalable. I've been using a very small package called juniper to seamlessly package Python Lambda functions.

In your particular case, these are the steps you need to take.

Assuming this is your folder structure:

.
├── manifest.yml
├── src
│   ├── requirements.txt
│   ├── lambda_function.py

In requirements.txt you include only the dependencies of your Lambda function, in this case Pillow (the PIL fork):

Pillow==6.0.0

Now, you just have to create a small file to tell juniper what to include in the zip file. The manifest.yml would look like:

functions:
  reko:
    requirements: ./src/requirements.txt
    include:
    - ./src/lambda_function.py

Now you need to pip install juniper in your local environment and execute the CLI command:

juni build

Juniper will create ./dist/reko.zip. That file will contain your source code as well as any dependencies you included in your requirements.txt file.

By default, juniper uses Docker containers and the build command uses Python 3.6; you can override that default.

I also ran into this exact same problem. There are two approaches you can take here: manual or automated packaging and deployment.

The manual approach involves creating the correct virtualenv, installing the dependencies in that virtual environment, and then zipping everything and uploading it to AWS.

To automate things, I prefer to use the Serverless Framework to package and deploy Lambda functions. Specifically, the serverless-python-requirements plugin helps with packaging. I do have to specify the following options to tell the framework to build inside a Docker container (so the dependencies are compiled for the Lambda environment) and not to strip any libraries:

custom:
  pythonRequirements:
    dockerizePip: true
    strip: false
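For context, here is a minimal sketch of how this setup is typically installed and deployed; it assumes the plugin is also declared under plugins: in your serverless.yml, and the commands are standard Serverless Framework usage rather than part of the original answer:

    # Install the Serverless Framework and the packaging plugin
    npm install -g serverless
    npm install --save-dev serverless-python-requirements

    # Package and deploy; with dockerizePip: true the plugin builds the
    # Python dependencies inside a Lambda-compatible Docker container
    serverless deploy

Setting strip: false keeps the symbols in compiled shared objects, which can avoid breaking binary packages such as Pillow.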

As most of the answers here already allude to, the AWS Lambda execution environment includes only the Python standard library and boto3, but nothing else.

To use external packages you need to include them yourself, either by building them and bundling them into your function upload, or by packaging them as layers. Also remember that the packages themselves need to be built for Amazon Linux.
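For example (an illustrative sketch, not from the original answer; the Docker image tag and file names are assumptions), you can build Pillow against an Amazon Linux based image locally instead of on EC2:

    # Install Pillow into the python/ directory that Lambda layers expect,
    # using a build image that mirrors the Lambda execution environment
    docker run --rm -v "$PWD":/var/task lambci/lambda:build-python3.7 \
        pip install Pillow -t python/

    # Zip it up; the archive can be published as a layer or merged into
    # your function package
    zip -r9 pillow-layer.zip python/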

If you're using Python 3.7, you can use this publicly available layer for Pillow: https://github.com/keithrozario/Klayers
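Once you pick the layer ARN for your region from that repository, you can attach it to your function, for example with the AWS CLI (the ARN and function name below are placeholders):

    # Attach the publicly shared Pillow layer; replace the ARN with the one
    # listed for your region and Python version in the Klayers repository
    aws lambda update-function-configuration \
        --function-name my-rekognition-function \
        --layers arn:aws:lambda:us-east-1:123456789012:layer:Klayers-python37-Pillow:1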
