
How to batch IoT data in AWS SQS and then store it in AWS S3

I'm trying to store IoT data in an AWS S3 bucket, but each message is very small (around 1-3 KB), so it makes no sense to store such small objects in S3 individually. I want to use AWS SQS (a FIFO queue) to hold the data for around 15-30 minutes, then send the batched JSON data to the S3 bucket, and I also want to do some calculations on the batched data before it gets stored.

I want to create a flow like this: AWS IoT Core -> AWS IoT Core rule (to SQS) -> AWS SQS (retain data for 15/30 min) -> S3 bucket

The payload looks like this:

{
    "sensor_name": "XXXX",
    "temp": 33.45,
    "humidity": 0.20,
    "timestamp": epochtime (added from the AWS IoT rule query)
}
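
For context, the timestamp is added by the IoT rule's SQL statement; AWS IoT SQL's timestamp() function returns the epoch time in milliseconds. A minimal example of such a rule query (the topic filter is just a placeholder for mine):

SELECT *, timestamp() AS timestamp FROM 'sensors/+/data'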

Also, after batching, the data should be stored in S3 under a key layout like sensor_name/yyyy/mm/dd,

so I can query the data with relative ease using AWS Athena to generate CSVs as and when required.
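
(For reference, this is a sketch of the Athena table I have in mind; the table and bucket names are placeholders, and `timestamp` is backquoted because it is a reserved word. Since sensor_name/yyyy/mm/dd is not Hive-style key=value partitioning, partitions would need to be registered with ALTER TABLE ... ADD PARTITION or partition projection.)

CREATE EXTERNAL TABLE sensor_data (
    temp double,
    humidity double,
    `timestamp` bigint
)
PARTITIONED BY (sensor_name string, year string, month string, day string)
ROW FORMAT SERDE 'org.openx.data.jsonserde.JsonSerDe'
LOCATION 's3://my-iot-bucket/';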

I tried AWS Kinesis, but it is a bit out of my budget at the moment. Any help would be greatly appreciated.

You can create an EventBridge rule to trigger a Lambda function every 15/30 minutes, which reads all the messages present in the queue, processes the data, and stores it in S3 in some aggregated format.
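Here is a minimal Python sketch of such a Lambda using boto3 (the queue URL, bucket name, and the averaging step are assumptions you would adapt; SQS returns at most 10 messages per receive call, hence the drain loop):

import json
from collections import defaultdict
from datetime import datetime, timezone

import boto3

sqs = boto3.client("sqs")
s3 = boto3.client("s3")

QUEUE_URL = "https://sqs.us-east-1.amazonaws.com/123456789012/iot-batch.fifo"  # placeholder
BUCKET = "my-iot-bucket"  # placeholder

def handler(event, context):
    batches = defaultdict(list)
    # Drain the queue: receive_message returns at most 10 messages per call.
    while True:
        resp = sqs.receive_message(
            QueueUrl=QUEUE_URL,
            MaxNumberOfMessages=10,
            WaitTimeSeconds=1,
        )
        messages = resp.get("Messages", [])
        if not messages:
            break
        for msg in messages:
            body = json.loads(msg["Body"])
            batches[body["sensor_name"]].append(body)
        # Delete what was consumed so it is not redelivered after
        # the visibility timeout expires.
        sqs.delete_message_batch(
            QueueUrl=QUEUE_URL,
            Entries=[
                {"Id": m["MessageId"], "ReceiptHandle": m["ReceiptHandle"]}
                for m in messages
            ],
        )
    now = datetime.now(timezone.utc)
    for sensor, readings in batches.items():
        # Example aggregation: averages stored alongside the raw readings.
        summary = {
            "avg_temp": sum(r["temp"] for r in readings) / len(readings),
            "avg_humidity": sum(r["humidity"] for r in readings) / len(readings),
            "readings": readings,
        }
        key = f"{sensor}/{now:%Y/%m/%d}/batch-{now:%H%M%S}.json"
        s3.put_object(Bucket=BUCKET, Key=key, Body=json.dumps(summary))

Writing one object per sensor per run gives you the sensor_name/yyyy/mm/dd layout you want; just make sure the Lambda's timeout and the queue's visibility timeout are long enough to process a full batch.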

Some time ago I used this approach to implement an image-capture app, where each frame was published as an SQS message. Every hour a Lambda function read the frames from the queue and created a time-lapse video, which was then stored in S3.
