
Loading modules into a Node.js AWS Lambda dynamically from S3

I have managed to do this, but I'm not sure if it's the correct way. I need to load a module into a Node.js (12.x) AWS Lambda. There are many possible modules I could load, and I want to select them dynamically rather than including them all in the Lambda zip. So I do the following in the Lambda:

  1. Load the file from S3 and save it to /tmp/my_module with fs.writeFile
  2. require("/tmp/my_module")

This works, but it seems a bit messy. Are there any alternatives? Ideally I would have liked to use import() rather than require(), but I understand that it's not possible to enable this feature in AWS Lambda.

AWS lets us create "Layers".

A layer is a ZIP archive that contains libraries, a custom runtime, or other dependencies. With layers, you can use libraries in your function without needing to include them in your deployment package.


However, layers need to be attached to the Lambda ahead of time, through the AWS CLI, the Lambda console, or the Serverless Framework.

They can be created from a local .zip archive or from a .zip in an S3 bucket.
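As a hedged sketch of the S3 route, a layer version can be published programmatically with the AWS SDK for JavaScript v2 (Lambda's PublishLayerVersion API); the layer name, bucket, and key below are hypothetical placeholders:

```javascript
// Sketch only: the layer name, bucket, and key are hypothetical placeholders.
// Build the parameters for Lambda's PublishLayerVersion API call.
function layerParams(bucket, key) {
  return {
    LayerName: "my-shared-modules",            // hypothetical layer name
    Content: { S3Bucket: bucket, S3Key: key }, // .zip already uploaded to S3
    CompatibleRuntimes: ["nodejs12.x"],
  };
}

// Actually publishing requires AWS credentials and `npm install aws-sdk`,
// so the call itself is shown commented out:
// const AWS = require("aws-sdk");
// const lambda = new AWS.Lambda({ region: "us-east-1" });
// lambda.publishLayerVersion(layerParams("my-bucket", "layers/my-modules.zip"))
//   .promise()
//   .then(res => console.log(res.LayerVersionArn));
```

The equivalent AWS CLI command is `aws lambda publish-layer-version`, and the returned layer version ARN is what gets attached to the function's configuration.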

The structure for the NodeJS layer is as follows:

Node.js – nodejs/node_modules, nodejs/node8/node_modules (NODE_PATH)

Example AWS X-Ray SDK for Node.js

xray-sdk.zip
└ nodejs/node_modules/aws-xray-sdk
