I have 10 Lambda functions for a project, plus one 'router' Lambda that copies files from an SFTP server and places them in S3; S3 then triggers the appropriate Lambda function according to the type of the file.
My requirement is to maintain the order of the file events for one type of file.
I thought of using an SQS FIFO queue to buffer the S3 events, but then found that S3 event notifications cannot target a FIFO queue, so the events cannot be processed in the order they were pushed. What is the best alternative solution for this?
It appears your situation is:

- Files arrive on an SFTP server
- A 'router' Lambda function copies each file to Amazon S3
- S3 event notifications trigger one of ten Lambda functions, depending on the file type
- For one file type, the files must be processed in the order in which they arrived
I would recommend that, rather than having Amazon S3 send a message to the Amazon SQS queue, you have your 'router' Lambda do it instead. Currently, the 'router' Lambda is already copying the file "according to the type of file". You just need to add one more command that sends a message to the FIFO SQS queue with details of the object that was copied. This takes the place of having S3 send the message to the SQS queue.
Thus, the 'router' Lambda would:

- Copy the file from the SFTP server to Amazon S3 (as it does now)
- Send a message to the appropriate SQS FIFO queue with details of the object it just copied
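The send step above can be sketched as follows. This is a minimal illustration, not your actual router code: the queue URL and the message body shape are placeholder assumptions, and the key point is that `MessageGroupId` is what gives you strict ordering within one file type.

```python
import json

# Placeholder FIFO queue URL -- substitute your own.
QUEUE_URL = "https://sqs.us-east-1.amazonaws.com/123456789012/file-events.fifo"

def build_fifo_message(bucket: str, key: str, file_type: str) -> dict:
    """Build the send_message parameters for one copied S3 object."""
    return {
        "MessageBody": json.dumps({"bucket": bucket, "key": key}),
        # Messages sharing a MessageGroupId are delivered strictly in order.
        "MessageGroupId": file_type,
        # Required unless content-based deduplication is enabled on the queue.
        "MessageDeduplicationId": key,
    }

def notify_queue(bucket: str, key: str, file_type: str) -> None:
    import boto3  # bundled in the Lambda Python runtime
    sqs = boto3.client("sqs")
    sqs.send_message(QueueUrl=QUEUE_URL, **build_fifo_message(bucket, key, file_type))
```

Using the file type as the `MessageGroupId` means each type forms its own ordered stream, so other file types are not held up behind the one that needs ordering.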
The SQS FIFO queue will trigger the appropriate Lambda function to process the file. Note that multiple messages can be delivered in a single Lambda invocation, so make sure to iterate through the event['Records'] array. Or, if you only want to process one file per invocation, set the event source mapping's batch size to 1.
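A minimal sketch of the consuming Lambda handler, assuming the message body shape described above (`bucket`/`key` JSON); `process_file` is a hypothetical stand-in for your real per-file logic:

```python
import json

# Collected results; in a real function this would be actual processing.
processed = []

def process_file(bucket: str, key: str) -> None:
    # Hypothetical per-file processing step.
    processed.append((bucket, key))

def handler(event, context):
    # One invocation may carry several SQS messages; each message is
    # one entry in event["Records"], in FIFO order within a group.
    for record in event["Records"]:
        payload = json.loads(record["body"])
        process_file(payload["bucket"], payload["key"])
```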
See also: New for AWS Lambda – SQS FIFO as an event source | AWS Compute Blog