
Unable to use pyodbc with aws lambda and API Gateway

I am trying to build an AWS Lambda function behind API Gateway that uses the pyodbc Python package. I have followed the steps described in the documentation, but when I test-run the Lambda function I keep getting the following error: Unable to import module 'app': libodbc.so.2: cannot open shared object file: No such file or directory.

Any help appreciated. I get the same error when I deploy the package using Chalice. It seems I may need to install unixodbc-dev. Any idea how to do that for AWS Lambda?

Simply untar the file from github - lambda_packages/pyodbc. It contains the .so files you need.

Now package your Python code and the .so files together and upload the zip to AWS Lambda. The folder structure, for your reference, should look like this:

<name_this_zip>.zip
├── lambda_function.py
├── libodbc.so.2
└── pyodbc.so

No subfolders exist
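The flat layout above can be produced with a short script. Here is a sketch using Python's zipfile module; the dummy-file creation exists only to make the sketch self-contained, and in practice these files are your real handler and the .so libraries from the tarball:

```python
import os
import zipfile

# Files that must sit at the ROOT of the zip (no subfolders).
artifacts = ["lambda_function.py", "libodbc.so.2", "pyodbc.so"]

# Stand-in placeholders so this sketch runs anywhere; replace with
# your actual handler and the .so files from lambda_packages/pyodbc.
for name in artifacts:
    if not os.path.exists(name):
        open(name, "w").close()

with zipfile.ZipFile("deployment.zip", "w", zipfile.ZIP_DEFLATED) as zf:
    for name in artifacts:
        # arcname strips any directory prefix so every file lands
        # at the top level of the archive.
        zf.write(name, arcname=os.path.basename(name))

print(sorted(zipfile.ZipFile("deployment.zip").namelist()))
```

The resulting deployment.zip can be uploaded directly in the Lambda console or via `aws lambda update-function-code`.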

Edit: Created a lambda layer and saved it for reuse. Get it here - https://github.com/kuharan/Lambda-Layers

pyodbc uses some native libs. Therefore you cannot just copy the contents of your site-packages to Lambda, as your OS is likely not Amazon Linux.

So you need to install pyodbc on an Amazon Linux instance and use the generated libs:

https://docs.aws.amazon.com/lambda/latest/dg/lambda-python-how-to-create-deployment-package.html

Or you can get prebuilt packages from here, if available:

https://github.com/Miserlou/lambda-packages

Try running this script to collect the dependencies into an S3 bucket, then add them to your Lambda deployment package.

""" 
This lambda function collects python pip dependencies, and uploads them to S3 bucket 
as a single tar.gz file. Example input for Lambda event: 
    event = {
        "prefix"            : "myPackage",
        "saveToS3Bucket"    : "my-s3-bucket",
        "saveToS3Key"       : "package-jwt.tar.gz",
        "requirements"      : [ "cryptography==2.1.3",
                                "PyJWT==1.5.3" ]
    }

Minimal Lambda execution role:
    {
        "Version": "2012-10-17",
        "Statement": [
            {
                "Sid": "Stmt1507151548000",
                "Effect": "Allow",
                "Action": [
                    "s3:PutObject"
                ],
                "Resource": [
                    "arn:aws:s3:::my-s3-bucket/package-jwt.tar.gz"
                ]
            },
            {
                "Effect": "Allow",
                "Action": [
                    "logs:CreateLogGroup",
                    "logs:CreateLogStream",
                    "logs:PutLogEvents"
                ],
                "Resource": "*"
            }
        ]
    }
"""

from subprocess import check_output
import uuid
import boto3

DEBUG_OUT_FILE = "/tmp/debug.txt"
S3 = boto3.resource('s3')


def lambda_handler(event, context):
    """Collect pip dependencies, tar them, and upload the archive to S3."""

    requirements = event.get('requirements', [])
    prefix = event.get('prefix', 'myPackage')
    saveToS3Bucket = event.get('saveToS3Bucket', None)
    saveToS3Key = event.get('saveToS3Key', None)
    location = "%s_%s" % (prefix, uuid.uuid4())
    destinationPath = '/tmp/%s' % location
    tarFileName = '/tmp/%s.tar.gz' % location

    for req in requirements:
        _exec(['pip', 'install', req, '-t', destinationPath])

    # -C /tmp keeps the archive rooted at `location` instead of
    # embedding the full /tmp/... path prefix.
    _exec(['tar', 'czvf', tarFileName, '-C', '/tmp', location])
    _persist_file_to_s3(tarFileName, saveToS3Bucket, saveToS3Key)
    return 'done!'


def _exec(statements):
    if statements and isinstance(statements, list):
        with open(DEBUG_OUT_FILE, "a") as f:
            try:
                f.write("\n$ %s \n" % " ".join(statements))
                rv = check_output(statements).decode("utf8")
                f.write(rv)
                print(rv)
            except Exception as ex:
                print(ex)
                f.write(str(ex))


def _persist_file_to_s3(filePathToUpload, s3Bucket, s3Key):
    if filePathToUpload and s3Bucket and s3Key:
        S3.meta.client.upload_file(filePathToUpload, s3Bucket, s3Key)

First, install the unixODBC and unixODBC-devel packages using yum install unixODBC unixODBC-devel. This step installs everything the pyodbc module requires.

The library you're missing is located in the /usr/lib64 folder on your Amazon Linux instance. Copy the library to your Python project's root folder. Since libodbc.so.2 is just a symbolic link, make sure you copy both the symbolic links and the library itself: libodbc.so, libodbc.so.2 and libodbc.so.2.0.0.
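Copying the links and the real library can be scripted with shutil, preserving the symlinks rather than following them. The sketch below simulates /usr/lib64 in a temp dir so it runs anywhere; on a real Amazon Linux box, src would be "/usr/lib64" and dst your project root:

```python
import os
import shutil
import tempfile

src = tempfile.mkdtemp()  # stands in for /usr/lib64
dst = tempfile.mkdtemp()  # stands in for your project root

# Recreate the layout described above: one real library, two symlinks.
open(os.path.join(src, "libodbc.so.2.0.0"), "w").close()
os.symlink("libodbc.so.2.0.0", os.path.join(src, "libodbc.so.2"))
os.symlink("libodbc.so.2.0.0", os.path.join(src, "libodbc.so"))

for name in ("libodbc.so", "libodbc.so.2", "libodbc.so.2.0.0"):
    # follow_symlinks=False copies the link itself, not its target,
    # so all three names keep pointing at one shared library.
    shutil.copy2(os.path.join(src, name), os.path.join(dst, name),
                 follow_symlinks=False)

print(sorted(os.listdir(dst)))
```

Preserving the links matters: copying each name as a regular file would triple the size of the deployment package for no benefit.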

@joarleymoraes' answer is correct.

Each Lambda instance under the hood is a container created from Amazon Linux AMI. pyodbc needs some native libraries to work which are not present by default inside the Lambda container. So to make things work, you need to ensure that the Lambda environment includes these native libraries in addition to pyodbc and your function code.

See https://medium.com/@narayan.anurag/breaking-the-ice-between-aws-lambda-pyodbc-6f53d5e2bd26 to understand more about the problem and the solution.

I resolved this by adding the pyodbc git package as a Lambda layer and setting the Python runtime to 3.7. Sample code:

conn = pyodbc.connect("DRIVER={0};SERVER={1};DATABASE={2};UID={3};PWD={4}".format(driver, server, database, username, password))

you can download the pyodbc layer from the GIT https://github.com/karthigces/lambda_layers
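A minimal sketch of assembling the connection string from the snippet above. The driver name and credentials are placeholders, and the actual pyodbc.connect call (commented out) needs the layer plus a matching ODBC driver present in the Lambda environment:

```python
# Placeholder values; on Lambda these typically come from
# environment variables or Secrets Manager.
driver = "{ODBC Driver 17 for SQL Server}"  # must match the driver in your layer
server = "myserver.example.com"
database = "mydb"
username = "admin"
password = "secret"

conn_str = "DRIVER={0};SERVER={1};DATABASE={2};UID={3};PWD={4}".format(
    driver, server, database, username, password)
print(conn_str)

# With the pyodbc layer attached:
# import pyodbc
# conn = pyodbc.connect(conn_str)
```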
