
Why doesn't concurrent AWS Lambda execution (inside a VPC) hit the VPC's IP limit?

We ran a quick experiment on AWS Lambda to confirm the concurrent execution limit implied by our VPC's IP limit. The VPC in our AWS account has around 500 IPs available. In general, if a Lambda function runs inside a VPC, the available IPs can be exhausted when more Lambdas run concurrently than there are IPs. The experiment details and Lambda functions are below.

We wrote a Lambda caller function (refer #1 below) that invokes a called Lambda function (refer #2) configured inside the VPC. We invoked the called function 999 times and made sure all of those invocations ran concurrently. Surprisingly, all the Lambdas finished without any complaint.

The biggest question is: if we have a 500 IP limit in our VPC and we ran 999 concurrent Lambdas inside it, why didn't we get an IP availability issue? Any ideas?

1. Lambda Caller Function (Node.js 10.x)

const AWS = require('aws-sdk');
const lambda = new AWS.Lambda();

// How long each called function should sleep (3 minutes), so that all
// invocations overlap and run concurrently.
const duration = 3 * 60 * 1000;

exports.handler = async (event) => {

    // Declared outside the try block so the catch handler can report
    // which invocation failed.
    let invokationNumber = 0;

    try {

        const lambdaPosterParams = {
            FunctionName: 'testCalledFunction',
            InvocationType: 'Event', // asynchronous: invoke() returns once the event is queued
            LogType: 'None'
        };

        for (invokationNumber = 0; invokationNumber < 999; invokationNumber++) {
            console.log("Invoking lambda #" + invokationNumber);
            lambdaPosterParams.Payload = JSON.stringify({
                'invokationNumber': invokationNumber,
                'duration': duration,
                'tableName': 'testload2'
            });
            // Because InvocationType is 'Event', this await only waits for the
            // 202 Accepted response, not for the called function to finish.
            const posterResponse = await lambda.invoke(lambdaPosterParams).promise();
            console.log("Poster Lambda invoked", JSON.stringify(posterResponse));
        }
    } catch (error) {
        console.error('Error invoking lambda #' + invokationNumber, error);
        throw error;
    }

    console.log("All lambdas invoked");
    return {
        statusCode: 200,
        body: JSON.stringify('Hello from Lambda!'),
    };
};

2. Lambda Called Function (Node.js 10.x)

const AWS = require('aws-sdk');

const dbConnection = new AWS.DynamoDB({
    region: process.env.AWS_REGION,
    apiVersion: process.env.AWS_DYNAMODB_API_VERSION
});

// Resolves after ms milliseconds; used to keep the invocation alive.
function timeout(ms) {
    return new Promise(resolve => setTimeout(resolve, ms));
}

exports.handler = async (event) => {

    const startTime = new Date();
    const insertData = {
        'TableName': event.tableName, // supplied by the caller's payload
        'Item': {
            'invokationNumber': {'N': event.invokationNumber.toString()},
            'startTime': {'S': startTime.toUTCString()},
        }
    };

    await dbConnection.putItem(insertData).promise();

    console.log(`Event #${event.invokationNumber}. Sleeping...`);

    // Sleep for the duration supplied by the caller (3 minutes) so that
    // all 999 invocations are running at the same time.
    await timeout(event.duration);

    console.log('Waking up...');

    const endTime = new Date();
    insertData.Item.endTime = {'S': endTime.toUTCString()};
    insertData.Item.duration = {'N': (endTime.getTime() - startTime.getTime()).toString()};

    await dbConnection.putItem(insertData).promise();

    return {
        statusCode: 200,
        body: JSON.stringify('Hello from Lambda!'),
    };
};

ENIs are not needed at a rate of one per concurrent invocation, unless your function is configured with the maximum allowable 3 GB of memory per invocation.

At 1.5 GB you can have two concurrent invocations per Elastic Network Interface (ENI), at 1 GB it's three, at 512 MB it's six, and at 128 MB it's approximately 24 concurrent invocations per ENI.

Approximately.

This is because your containers are allocated on m-class EC2 instances (or something extremely comparable), with one ENI per instance, and each instance has roughly 3 GB of usable memory available for containers. The smaller the memory assigned to the function, the more containers fit on each (hidden, managed) EC2 instance, and so the fewer ENIs are required for a given level of concurrency.
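Under this (unofficial) model, the number of concurrent invocations a single ENI can serve is roughly floor(3 GB / function memory). A minimal sketch, assuming about 3072 MB of usable container memory per hidden worker instance (the helper name and the exact memory figure are assumptions, not documented values):

```javascript
// Approximate concurrent invocations served by one ENI, assuming one ENI
// per hidden worker instance and ~3 GB of container memory per instance.
function invocationsPerEni(memoryMb) {
    const usableMbPerWorker = 3 * 1024; // assumption: ~3 GB usable per worker
    return Math.max(1, Math.floor(usableMbPerWorker / memoryMb));
}

console.log(invocationsPerEni(3072)); // 1  (3 GB function: one per ENI)
console.log(invocationsPerEni(1536)); // 2  (1.5 GB)
console.log(invocationsPerEni(1024)); // 3  (1 GB)
console.log(invocationsPerEni(512));  // 6  (512 MB)
console.log(invocationsPerEni(128));  // 24 (128 MB)
```

These outputs match the per-memory figures quoted above, which is why a 128 MB function needs far fewer IPs than one ENI per invocation would suggest.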

If your Lambda function accesses a VPC, you must make sure that your VPC has sufficient ENI capacity to support the scale requirements of your Lambda function. You can use the following formula to approximately determine the ENI requirements.

Projected peak concurrent executions * (Memory in GB / 3GB)

https://docs.aws.amazon.com/lambda/latest/dg/vpc.html
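The docs formula above can be sketched in Node.js to explain the experiment's result; `projectedEnis` is a hypothetical helper name, and the numbers use the experiment's 999 concurrent executions:

```javascript
// AWS docs formula: Projected peak concurrent executions * (Memory in GB / 3 GB),
// rounded up to whole ENIs. Memory is taken in MB here (3 GB = 3072 MB).
function projectedEnis(peakConcurrency, memoryMb) {
    return Math.ceil(peakConcurrency * (memoryMb / 3072));
}

console.log(projectedEnis(999, 128));  // 42  -> well under 500 available IPs
console.log(projectedEnis(999, 1024)); // 333 -> still fits in 500 IPs
console.log(projectedEnis(999, 3072)); // 999 -> would exhaust a 500 IP VPC
```

So with a small memory setting, 999 concurrent executions need only a few dozen ENIs, which is why the experiment never ran out of IPs.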

Configure your function for 3 GB and your original expectation should be confirmed: you'll run out of ENIs, either due to a lack of IP addresses or due to your account's limit on the maximum number of ENIs in the region.
