
Why is concurrent AWS Lambda (inside a VPC) execution not complaining about the IP limit in the VPC?

We ran a quick experiment on AWS Lambda to check whether concurrent execution is limited by the number of IP addresses available in our VPC. In our AWS account, the VPC has around 500 IPs available. In general, if a Lambda function runs inside a VPC, the available IPs can be exhausted when more Lambdas run concurrently than there are IPs. The experiment details and Lambda functions are below.

We wrote a Lambda caller function (#1 below) that invokes a Lambda called function (#2 below) configured inside the VPC. We invoked the called function 999 times and made sure that all of these invocations ran concurrently. Surprisingly, all of them finished without any complaint.

The big question: if our VPC has a limit of roughly 500 IPs and we ran the Lambda 999 times concurrently inside the VPC, why didn't we hit an IP availability issue? Any ideas?

1. Lambda Caller Function (Node.js 10.x)

const AWS = require('aws-sdk');
const lambda = new AWS.Lambda();

// How long each called function should stay busy (3 minutes), passed in the payload.
const duration = 3 * 60 * 1000;

exports.handler = async (event) => {

    // Declared outside the loop so the catch block can report which invocation failed.
    let invokationNumber = 0;

    try {

        const lambdaPosterParams = {
            FunctionName: 'testCalledFunction',
            InvocationType: 'Event', // asynchronous invocation, so the called functions run concurrently
            LogType: 'None'
        };

        for (invokationNumber = 0; invokationNumber < 999; invokationNumber++) {
            console.log("Invoking lambda #" + invokationNumber);
            lambdaPosterParams.Payload = JSON.stringify({
                'invokationNumber': invokationNumber,
                'duration': duration,
                'tableName': 'testload2'
            });
            // invoke() with InvocationType 'Event' returns as soon as Lambda accepts the
            // request, so this loop only queues the 999 invocations; it does not wait
            // for any of them to finish.
            const posterResponse = await lambda.invoke(lambdaPosterParams).promise();
            console.log("Poster Lambda invoked", JSON.stringify(posterResponse));
        }
    } catch (error) {
        console.error('Error invoking lambda #' + invokationNumber, error);
        throw error;
    }

    console.log("All lambdas invoked");
    const response = {
        statusCode: 200,
        body: JSON.stringify('Hello from Lambda!'),
    };
    return response;
};

2. Lambda Called Function (Node.js 10.x)

const AWS = require('aws-sdk');

const dbConnection = new AWS.DynamoDB({
    region: process.env.AWS_REGION,
    apiVersion: process.env.AWS_DYNAMODB_API_VERSION
});

// Promisified sleep, used to keep the function running (and holding its ENI/IP) for a while.
function timeout(ms) {
    return new Promise(resolve => setTimeout(resolve, ms));
}

exports.handler = async (event) => {

    const startTime = new Date();

    // Note: the table name and sleep duration are hardcoded here, even though the
    // caller also passes tableName and duration in the payload.
    const insertData = {
        'TableName': 'testload',
        'Item': {
            'invokationNumber': {'N': event.invokationNumber.toString()},
            'startTime': {'S': startTime.toUTCString()},
        }
    };

    // Record the start time of this invocation.
    await dbConnection.putItem(insertData).promise();

    console.log(`Event #${event.invokationNumber}. Sleeping...`);

    await timeout(3 * 60 * 1000); // sleep for 3 minutes so all invocations overlap

    console.log('Waking up...');

    const endTime = new Date();
    insertData.Item.endTime = {'S': endTime.toUTCString()};
    insertData.Item.duration = {'N': (endTime.getTime() - startTime.getTime()).toString()};

    // Overwrite the same item, now including the end time and measured duration.
    await dbConnection.putItem(insertData).promise();

    const response = {
        statusCode: 200,
        body: JSON.stringify('Hello from Lambda!'),
    };

    return response;
};

ENIs are not needed at a rate of one per concurrent invocation, unless your function is configured to be allocated the maximum allowable 3 GB of memory per invocation.

At 1.5 GB you can have two concurrent invocations per Elastic Network Interface (ENI); at 1 GB it's three, at 512 MB it's six, and at 128 MB it's approximately 24 concurrent invocations per ENI.

Approximately.

This is because your containers are allocated on m-class EC2 instances (or something extremely comparable), one ENI per instance, and each instance has 3 GB of usable memory available for containers. The smaller the memory assigned to the function, the more containers fit on each (hidden, managed) EC2 instance, so the fewer ENIs are required for a given level of concurrency.
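Applied to the experiment above (assuming the called function was left at a small memory size such as 128 MB, which the question does not state): 999 concurrent invocations at roughly 24 invocations per ENI need only about 999 / 24 ≈ 42 ENIs, i.e. about 42 IP addresses, which is well below the ~500 IPs available in the VPC -- so no exhaustion would be observed.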

If your Lambda function accesses a VPC, you must make sure that your VPC has sufficient ENI capacity to support the scale requirements of your Lambda function. You can use the following formula to approximately determine the ENI requirements.

Projected peak concurrent executions * (Memory in GB / 3GB)

https://docs.aws.amazon.com/lambda/latest/dg/vpc.html
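As a rough illustration of that formula, here is a minimal sketch. The helper name projectedEnis and the sample inputs (999 peak concurrent executions, 128 MB and 3 GB memory sizes) are assumptions taken from the experiment and the explanation above, not from the AWS documentation.

// Estimate the number of ENIs (and therefore IP addresses) a VPC-enabled
// function needs, using: projected peak concurrent executions * (memory in GB / 3 GB).
function projectedEnis(peakConcurrentExecutions, memoryMb) {
    const memoryGb = memoryMb / 1024;
    return Math.ceil(peakConcurrentExecutions * (memoryGb / 3));
}

console.log(projectedEnis(999, 128));  // ~42 ENIs/IPs -- comfortably below the ~500 available IPs
console.log(projectedEnis(999, 3072)); // 999 ENIs/IPs -- would exhaust the ~500 available IPs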

Configure your function for 3 GB and your original expectations should be confirmed -- you'll run out of ENIs, either due to a lack of IP addresses or due to your account limit for the maximum number of ENIs in the region.
