
AWS Lambda function not respecting batch size 1 when reading from SQS Queue

I recently set up a simple AWS Lambda function that reads from an SQS FIFO queue. The jobs I run in Lambda need to run concurrently at a specific time. I assumed that a batch size of 1 would force each job to be mapped to a separate Lambda instance when the jobs are sent in bulk. However, I am seeing consecutive jobs sent to the queue land on the same instance and run serially, like a batch job would.

I expected setting batch size = 1 to force concurrent execution of the jobs in the SQS queue, but instead they run serially in Lambda.

Is there anything I'm missing about the batch size=1 parameter?
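
For clarity, the batch size in question is the BatchSize on the SQS event source mapping. A rough sketch of how it can be set with boto3 (the queue ARN and function name below are placeholders, not my real values):

import boto3

lambda_client = boto3.client("lambda")

# Placeholder queue ARN and function name, for illustration only.
# BatchSize=1 means each invocation receives at most one SQS message.
lambda_client.create_event_source_mapping(
    EventSourceArn="arn:aws:sqs:us-east-1:123456789012:jobs.fifo",
    FunctionName="job-runner",
    BatchSize=1,
)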

Edit:

Here's how I'm sending the jobs:

for req in reqs:
    # A fresh random ID per message is used for both the deduplication ID
    # and the message group ID, so every message gets its own group.
    rand_str = str(random.randint(1, 2**15))
    queue.send_message(
        MessageBody=json.dumps(req),
        MessageDeduplicationId=rand_str,
        MessageGroupId=rand_str,
    )

and here's the lambda handler:

def handle_request(event, context):
    # With batch size 1, the event should contain a single record.
    request = event['Records'][0]
    payload = json.loads(request["body"])
    b = Bot(payload)
    b.run()
    handle_result(b)

When using an SQS FIFO queue, Lambda processes only one batch at a time per message group, in order to preserve ordering within each group. If you don't need strict ordering, use a standard queue instead.

https://docs.aws.amazon.com/lambda/latest/dg/with-sqs.html#events-sqs-scaling
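
If strict ordering isn't required, sending the same payloads to a standard queue looks roughly like this (the queue name is a placeholder; standard queues take no MessageGroupId or MessageDeduplicationId, so Lambda is free to invoke many instances in parallel):

import json
import boto3

sqs = boto3.resource("sqs")
# Placeholder name of a standard (non-FIFO) queue.
queue = sqs.get_queue_by_name(QueueName="jobs-standard")

reqs = [{"job": "example"}]  # the job payloads from the question

for req in reqs:
    # No MessageGroupId/MessageDeduplicationId on a standard queue.
    queue.send_message(MessageBody=json.dumps(req))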
