
Parsing large JSON results from a REST API

I am facing the problem of parsing large JSON results from a REST endpoint (Elasticsearch).

Leaving aside the fact that the system's design has its flaws, I am wondering whether there is a better way to do the parsing.

The REST response contains 10k objects in a JSON array. I am using the native JSON mapper of Elasticsearch as well as Jsoniter; both lack performance and slow the application down, with request durations rising to 10-15 seconds.

I will push for a change to the interface, but the large result list will remain for the next 6 months.

Could anyone advise me on what to do to improve performance with Elasticsearch?

Profile everything.

Is Elasticsearch slow in generating the response?
If you perform the query with curl, redirect the output to a file, and time it, what fraction of your app's total time does that account for?
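For example, `time curl -s -o response.json 'http://localhost:9200/my-index/_search?size=10000'` (host, index, and query are placeholders). Below is a minimal Java sketch of the same fetch-only measurement, assuming the same hypothetical endpoint:

```java
import java.net.URI;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;
import java.nio.file.Files;
import java.nio.file.Paths;

public class FetchTimer {
    public static void main(String[] args) throws Exception {
        // Hypothetical endpoint; substitute your own host, index, and query.
        URI uri = URI.create("http://localhost:9200/my-index/_search?size=10000");
        HttpClient client = HttpClient.newHttpClient();
        HttpRequest request = HttpRequest.newBuilder(uri).GET().build();

        long start = System.nanoTime();
        HttpResponse<String> response =
                client.send(request, HttpResponse.BodyHandlers.ofString());
        long elapsedMs = (System.nanoTime() - start) / 1_000_000;

        // This measures server + transport time only; no JSON decoding yet.
        System.out.printf("fetched %d chars in %d ms%n",
                response.body().length(), elapsedMs);

        // Save the body so a decode-only test can reuse it.
        Files.writeString(Paths.get("response.json"), response.body());
    }
}
```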

Are you running it locally, or over the network? If the latter, you might be dropping packets or being throttled by low bandwidth.

Is the performance hit purely from decoding the response? How long does it take to decode the same blob of JSON with Jsoniter once it is loaded into memory from a static file?
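A minimal sketch of that decode-only measurement, assuming the body was saved to response.json (as in the fetch test above) and has the standard _search response shape:

```java
import com.jsoniter.JsonIterator;
import com.jsoniter.any.Any;
import java.nio.file.Files;
import java.nio.file.Paths;

public class DecodeTimer {
    public static void main(String[] args) throws Exception {
        // Load the previously saved response so no network time is included.
        String body = Files.readString(Paths.get("response.json"));

        long start = System.nanoTime();
        Any parsed = JsonIterator.deserialize(body);   // decode only
        Any hits = parsed.get("hits", "hits");         // standard _search layout
        long elapsedMs = (System.nanoTime() - start) / 1_000_000;

        System.out.printf("decoded %d hits in %d ms%n", hits.size(), elapsedMs);
    }
}
```

Comparing this number against the curl/fetch timing tells you whether to attack the query (server side) or the parsing (client side).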

Have you considered chunking your query (see the paginated sketch below)?
What about spinning it off as a separate process and immediately returning to the event loop?
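One way to chunk is plain from/size pagination against the _search endpoint, sketched below with a hypothetical host and index. Note that from + size must stay within index.max_result_window (10000 by default); for deeper paging, Elasticsearch's search_after or scroll APIs are the usual tools.

```java
import com.jsoniter.JsonIterator;
import com.jsoniter.any.Any;
import java.net.URI;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;

public class ChunkedFetch {
    private static final int PAGE_SIZE = 1_000;   // tune to your document size
    private static final int TOTAL_HITS = 10_000; // the known result-set size

    public static void main(String[] args) throws Exception {
        HttpClient client = HttpClient.newHttpClient();

        // Fetch and decode the result set page by page instead of in one
        // 10k-object response, so each individual parse stays small.
        for (int from = 0; from < TOTAL_HITS; from += PAGE_SIZE) {
            URI uri = URI.create(String.format(
                    "http://localhost:9200/my-index/_search?from=%d&size=%d",
                    from, PAGE_SIZE));
            HttpRequest request = HttpRequest.newBuilder(uri).GET().build();
            HttpResponse<String> response =
                    client.send(request, HttpResponse.BodyHandlers.ofString());

            Any hits = JsonIterator.deserialize(response.body()).get("hits", "hits");
            System.out.printf("page at from=%d: %d hits%n", from, hits.size());
            // ... process this page before (or while) fetching the next one
        }
    }
}
```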

There are lots of options and not enough detail in your question to be able to give solid advice.
