
Split list of data into small chunks and process it using ExecutorService in Java

My requirement: I have 100k records; I need to split them into batches of 1k each and then process each batch (call a REST API).

  1. What flow should I be using here? Split the 100k records into 1k batches and then call the executor service? I believe this approach might simply increase memory use (by re-storing the split data). Or should I split the data and call the REST API simultaneously? (See the sketch below the question.)

  2. Should I use some Queue implementation?

  3. Is using an ExecutorService the right approach?

  4. Is there any other, extremely clean way of splitting huge data sets and calling the REST API?

Please note: I am just looking for a nice "APPROACH/FLOW/DESIGN".

Can someone suggest an approach?

EDIT: I am fetching 100k phone numbers from MongoDB. I need to split them into batches of 1k each and send a message for each batch. If any error occurs, it can be logged to log files (no need to handle it further).
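To make points 1-3 concrete, here is a minimal sketch of the batching-plus-ExecutorService flow. It assumes the 100k numbers fit in memory as one list; `fetchFromMongo`, `sendBatch`, and the commented-out `restClient` call are hypothetical placeholders, not real APIs:

```java
import java.util.List;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.TimeUnit;

public class BatchSmsSender {

    private static final int BATCH_SIZE = 1_000;
    private static final int THREADS = 10;

    public static void main(String[] args) throws InterruptedException {
        List<String> numbers = fetchFromMongo(); // assumed: loads the ~100k numbers

        ExecutorService pool = Executors.newFixedThreadPool(THREADS);
        for (int from = 0; from < numbers.size(); from += BATCH_SIZE) {
            int to = Math.min(from + BATCH_SIZE, numbers.size());
            // subList returns a view, not a copy, so splitting adds no real memory cost
            List<String> batch = numbers.subList(from, to);
            pool.submit(() -> sendBatch(batch));
        }

        pool.shutdown();                          // stop accepting new tasks
        pool.awaitTermination(1, TimeUnit.HOURS); // wait for in-flight batches
    }

    // Placeholder for the real REST call; errors are only logged, per the question.
    private static void sendBatch(List<String> batch) {
        try {
            // restClient.sendSms(batch);  // hypothetical client call
            System.out.println("Sent batch of " + batch.size());
        } catch (Exception e) {
            System.err.println("Batch failed: " + e.getMessage());
        }
    }

    // Placeholder: in reality this would query MongoDB for the phone numbers.
    private static List<String> fetchFromMongo() {
        return List.of(); // stand-in so the sketch compiles
    }
}
```

Because `List.subList` returns a view over the original list, the "split" step is just index bookkeeping and does not duplicate the data (point 1); the fixed-size pool's internal work queue also plays the role of the queue asked about in point 2.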

This answer is targeted at your point 4:

I would suggest that you have a look at Apache Camel or Apache NiFi to process this structure and amount of data. If you don't know either of them yet, both systems will require getting used to some new concepts. NiFi is the more modern one, while Camel can be configured via Java.

Both systems should make it quite easy to read the data from Mongo, split it, and send out the REST requests.
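For illustration, a Camel route for this job could look roughly like the following Java DSL sketch. The endpoint URIs, the `myMongoClient` bean name, and the SMS URL are made-up placeholders, and the split-then-aggregate re-batching shown here should be checked against the component documentation for your Camel version:

```java
import org.apache.camel.builder.RouteBuilder;
import org.apache.camel.processor.aggregate.GroupedBodyAggregationStrategy;

public class SmsRoute extends RouteBuilder {
    @Override
    public void configure() {
        // Fire once, fetch all documents, re-batch into groups of 1k, POST each group.
        from("timer:once?repeatCount=1")
            .to("mongodb:myMongoClient?database=mydb&collection=phones&operation=findAll")
            .split(body()).streaming()              // one exchange per document, streamed
                .aggregate(constant(true), new GroupedBodyAggregationStrategy())
                    .completionSize(1000)           // complete a batch at 1k entries
                    .completionTimeout(5000)        // flush the final partial batch
                    .to("http://sms.example.com/send"); // hypothetical REST endpoint
    }
}
```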

On the other hand, this might be too much overhead if this is a one-time-only job. But it is useful if you need the job to run reliably over and over again.
