Managing large AWS SQS payloads over S3

I am working on a Java 11/Spring Boot project in which I need to send and consume an SQS message that is over 256 KB, which is the standard SQS limit. I can't change the modeling of the system in a way that would make the message smaller than 256 KB.

I know AWS supports bigger payloads through its SQS Extended Client Library, documented here: https://docs.aws.amazon.com/AWSSimpleQueueService/latest/SQSDeveloperGuide/sqs-s3-messages.html#working-java-example-using-s3-for-large-sqs-messages-example

I copied and tested the example for sending the message, but I am still not sure how it behaves with the Spring Boot integration (@SqsListener) when consuming this kind of message. The code ran successfully, but I am not sure whether the payload had already been deleted from the S3 bucket after consumption, because I couldn't see the message stored there. In the example, deleting the message has to be done manually, but I hadn't coded that part when I ran it.
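
For reference, a minimal sketch of the sending side based on the linked docs example (assuming the SQS Extended Client Library API for the AWS SDK v1; the bucket name and queue URL below are placeholders):

    import com.amazon.sqs.javamessaging.AmazonSQSExtendedClient;
    import com.amazon.sqs.javamessaging.ExtendedClientConfiguration;
    import com.amazonaws.services.s3.AmazonS3;
    import com.amazonaws.services.s3.AmazonS3ClientBuilder;
    import com.amazonaws.services.sqs.AmazonSQS;
    import com.amazonaws.services.sqs.AmazonSQSClientBuilder;
    import com.amazonaws.services.sqs.model.SendMessageRequest;

    public class LargePayloadSender {

        public static void main(String[] args) {
            AmazonS3 s3 = AmazonS3ClientBuilder.defaultClient();
            String bucketName = "my-sqs-large-payload-bucket";                                  // placeholder
            String queueUrl = "https://sqs.us-east-1.amazonaws.com/123456789012/my-queue";      // placeholder

            // Messages larger than the SQS limit are offloaded to this bucket;
            // smaller ones are sent through SQS as usual.
            ExtendedClientConfiguration extendedConfig = new ExtendedClientConfiguration()
                    .withPayloadSupportEnabled(s3, bucketName);

            AmazonSQS sqsExtended = new AmazonSQSExtendedClient(
                    AmazonSQSClientBuilder.defaultClient(), extendedConfig);

            // Roughly 300,000 ASCII characters, i.e. about 300 KB, clearly above 256 KB.
            String largeBody = "x".repeat(300_000);
            sqsExtended.sendMessage(new SendMessageRequest(queueUrl, largeBody));
        }
    }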

Does the Spring Boot @SqsListener consumer already take care of deleting the message after consumption and leave everything cleaned up, or do I still need to handle something myself?

I found the answer by debugging the AWS SDK code. I will put it here for anyone who needs it in the future.

I initially thought the message was big enough because I assumed the encoding was using 2 bytes per char, but by analyzing the AWS method Util.getStringSizeInBytes I saw it was counting only 1 byte per character.
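
A quick way to see this (a standalone sketch, assuming Util.getStringSizeInBytes counts UTF-8 encoded bytes, which matches the 1-byte-per-ASCII-character behaviour I observed): a Java char takes 2 bytes in memory, but plain ASCII text encodes to 1 byte per character, so a 200,000-character ASCII string is well under the 262,144-byte limit.

    import java.nio.charset.StandardCharsets;

    public class PayloadSizeCheck {

        // SQS limit for a message: 262,144 bytes (256 KB).
        private static final int SQS_LIMIT_BYTES = 262_144;

        public static void main(String[] args) {
            String asciiPayload = "x".repeat(200_000);    // ASCII: 1 byte per char in UTF-8
            String accentedPayload = "é".repeat(200_000); // 'é' encodes to 2 bytes in UTF-8

            System.out.println(asciiPayload.getBytes(StandardCharsets.UTF_8).length);    // 200000
            System.out.println(accentedPayload.getBytes(StandardCharsets.UTF_8).length); // 400000

            // Only the accented payload crosses the limit, even though both
            // strings contain the same number of Java chars.
            System.out.println(asciiPayload.getBytes(StandardCharsets.UTF_8).length > SQS_LIMIT_BYTES);    // false
            System.out.println(accentedPayload.getBytes(StandardCharsets.UTF_8).length > SQS_LIMIT_BYTES); // true
        }
    }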

If you check the code of AmazonSQSExtendedClient.sendMessage, you will see that it makes this comparison:

    if (this.clientConfiguration.isAlwaysThroughS3() || this.isLarge(sendMessageRequest)) {
        sendMessageRequest = this.storeMessageInS3(sendMessageRequest);
    }

Then I saw that both this.isLarge and isAlwaysThroughS3 were false, so it wasn't actually using the S3 bucket at all. For testing, I increased the payload until isLarge changed to true, and then I could see the message in the S3 bucket.
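
If you just want to exercise the S3 path without padding the payload, the flag that isAlwaysThroughS3() reads can be enabled on the configuration. A minimal sketch, assuming ExtendedClientConfiguration exposes a withAlwaysThroughS3 setter matching the getter seen in the snippet above (bucket name and queue URL are placeholders):

    import com.amazon.sqs.javamessaging.AmazonSQSExtendedClient;
    import com.amazon.sqs.javamessaging.ExtendedClientConfiguration;
    import com.amazonaws.services.s3.AmazonS3ClientBuilder;
    import com.amazonaws.services.sqs.AmazonSQS;
    import com.amazonaws.services.sqs.AmazonSQSClientBuilder;
    import com.amazonaws.services.sqs.model.SendMessageRequest;

    public class ForceS3Sender {

        public static void main(String[] args) {
            // Force every message through S3, regardless of its size.
            ExtendedClientConfiguration forceS3Config = new ExtendedClientConfiguration()
                    .withPayloadSupportEnabled(AmazonS3ClientBuilder.defaultClient(), "my-sqs-large-payload-bucket")
                    .withAlwaysThroughS3(true);

            AmazonSQS sqsExtended = new AmazonSQSExtendedClient(
                    AmazonSQSClientBuilder.defaultClient(), forceS3Config);

            // Even this small body ends up in the S3 bucket, with only the reference on SQS.
            sqsExtended.sendMessage(new SendMessageRequest(
                    "https://sqs.us-east-1.amazonaws.com/123456789012/my-queue",
                    "small body that would normally stay inside SQS"));
        }
    }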

Another good tip: when S3 is in use, only the S3 key is sent in the SQS message, not the full payload.
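
To see that end to end, here is a receive/delete sketch through the same extended client, following the linked docs example rather than the @SqsListener path (queue URL and bucket name are placeholders). On receive, the extended client resolves the S3 reference and hands you the full body; per the docs example, deleting the message through the extended client is what also removes the payload object from the bucket.

    import java.util.List;

    import com.amazon.sqs.javamessaging.AmazonSQSExtendedClient;
    import com.amazon.sqs.javamessaging.ExtendedClientConfiguration;
    import com.amazonaws.services.s3.AmazonS3ClientBuilder;
    import com.amazonaws.services.sqs.AmazonSQS;
    import com.amazonaws.services.sqs.AmazonSQSClientBuilder;
    import com.amazonaws.services.sqs.model.DeleteMessageRequest;
    import com.amazonaws.services.sqs.model.Message;
    import com.amazonaws.services.sqs.model.ReceiveMessageRequest;

    public class LargePayloadConsumer {

        public static void main(String[] args) {
            String queueUrl = "https://sqs.us-east-1.amazonaws.com/123456789012/my-queue"; // placeholder

            ExtendedClientConfiguration extendedConfig = new ExtendedClientConfiguration()
                    .withPayloadSupportEnabled(AmazonS3ClientBuilder.defaultClient(),
                            "my-sqs-large-payload-bucket"); // placeholder

            AmazonSQS sqsExtended = new AmazonSQSExtendedClient(
                    AmazonSQSClientBuilder.defaultClient(), extendedConfig);

            // getBody() returns the original large payload, not the S3 reference stored in SQS.
            List<Message> messages = sqsExtended.receiveMessage(new ReceiveMessageRequest(queueUrl)).getMessages();

            for (Message message : messages) {
                System.out.println("Body length: " + message.getBody().length());

                // Deleting through the extended client removes the SQS message and,
                // per the docs example, the corresponding payload object in S3.
                sqsExtended.deleteMessage(new DeleteMessageRequest(queueUrl, message.getReceiptHandle()));
            }
        }
    }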
