
Extract results from Spark Streaming into a Java object

I have integrated Spark Streaming with Kafka in Java and am able to aggregate the stats. However, I cannot figure out a way to store the result in a Java object so that I can pass it around between methods/classes without writing it to a database. I have spent quite some time searching for tutorials/examples online, but all of them end with using print() to display the result on the console. What I am trying to do is return these results as a JSON string when users call a REST API endpoint.

Is it possible to keep these results in memory and pass them around between methods, or do I need to store them in a database first and fetch them from there as needed?
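
For reference, here is a minimal sketch of the in-memory approach being described, assuming the job already produces a JavaPairDStream<String, Long> of aggregated counts; the class and method names (LatestStats, publish, snapshot) are made up for illustration, and it only works if the REST server runs in the same JVM as the Spark driver:

```java
import java.util.Map;
import java.util.concurrent.ConcurrentHashMap;

import org.apache.spark.streaming.api.java.JavaPairDStream;

// Hypothetical holder for the latest aggregated counts, shared between
// the streaming job and whatever serves the REST endpoint.
public class LatestStats {

    private static final Map<String, Long> latest = new ConcurrentHashMap<>();

    // Wire an already-built DStream of (key, count) pairs into the shared map.
    // How the DStream is produced from Kafka is left out here.
    public static void publish(JavaPairDStream<String, Long> counts) {
        counts.foreachRDD(rdd -> {
            // collectAsMap() brings the micro-batch result back to the driver,
            // so this is only reasonable for small aggregated results.
            Map<String, Long> batch = rdd.collectAsMap();
            latest.clear();
            latest.putAll(batch);
        });
    }

    // A REST handler could call this to build its JSON response.
    public static Map<String, Long> snapshot() {
        return new ConcurrentHashMap<>(latest);
    }
}
```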

If I understand you correctly, you want to consume the results of your Spark Streaming job via a REST API.

Even though there are ways to accomplish this directly (e.g. using Spark SQL / the Thrift server), I would separate these two tasks. Otherwise, if your Spark Streaming process fails, your service/REST API layer fails with it.

So separating the two layers has its advantages, and you are not forced to use a classical database. You could implement a service that uses JCache and send the results of the Spark Streaming process to it.
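
As a rough sketch of that idea, the streaming job could push each batch's aggregated counts into a JCache (JSR-107) cache that a separate service layer then reads from. This again assumes a JavaPairDStream<String, Long> of already-aggregated counts; the cache name "stats" and the helper class are invented for illustration, and a JCache provider (Hazelcast, Ehcache, Ignite, ...) has to be on the classpath:

```java
import java.util.Map;

import javax.cache.Cache;
import javax.cache.CacheManager;
import javax.cache.Caching;
import javax.cache.configuration.MutableConfiguration;

import org.apache.spark.streaming.api.java.JavaPairDStream;

public class StatsCacheWriter {

    public static void publish(JavaPairDStream<String, Long> counts) {
        counts.foreachRDD(rdd -> {
            // The foreachRDD body runs on the driver; the JCache provider on the
            // classpath decides where the cached data actually lives, so the
            // REST/service layer can be a completely separate process.
            CacheManager manager = Caching.getCachingProvider().getCacheManager();
            Cache<String, Long> cache = manager.getCache("stats", String.class, Long.class);
            if (cache == null) {
                cache = manager.createCache("stats",
                        new MutableConfiguration<String, Long>()
                                .setTypes(String.class, Long.class));
            }
            // Again assumes the aggregated result is small enough to collect.
            Map<String, Long> batch = rdd.collectAsMap();
            cache.putAll(batch);
        });
    }
}
```

With a distributed provider behind JCache, the REST service only depends on the cache, so it keeps serving the last known results even if the streaming job restarts.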
