
Spark KafkaUtils createRDD: apply filter on key

I have a huge Kafka topic that contains messages with several different keys. I want to process only the records with a given key in an RDD, without having to download the whole topic. The data is also interleaved, so I cannot even rely on offset ranges within the topic. Any suggestion on how to do this?

That is not possible: you have to read all the records and filter them by key yourself.
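
For reference, a minimal sketch of what that looks like with the spark-streaming-kafka-0-10 integration: createRDD pulls every record in the requested offset ranges, and the key selection can only happen afterwards as an ordinary RDD filter. The broker address, topic name ("my-topic", three partitions), offset bounds, and the key value are all hypothetical placeholders.

```scala
import org.apache.kafka.common.serialization.StringDeserializer
import org.apache.spark.SparkContext
import org.apache.spark.streaming.kafka010.{KafkaUtils, LocationStrategies, OffsetRange}

object FilterTopicByKey {
  def main(args: Array[String]): Unit = {
    val sc = new SparkContext("local[*]", "kafka-filter-by-key")

    // Consumer settings; adjust the broker and group id for your cluster.
    val kafkaParams = new java.util.HashMap[String, Object]()
    kafkaParams.put("bootstrap.servers", "localhost:9092")
    kafkaParams.put("key.deserializer", classOf[StringDeserializer])
    kafkaParams.put("value.deserializer", classOf[StringDeserializer])
    kafkaParams.put("group.id", "filter-by-key-example")

    // One OffsetRange per partition: read offsets [0, 1000000) from partitions 0..2.
    val offsetRanges = (0 to 2).map(p => OffsetRange("my-topic", p, 0L, 1000000L)).toArray

    // createRDD fetches every record in the given offset ranges ...
    val records = KafkaUtils.createRDD[String, String](
      sc, kafkaParams, offsetRanges, LocationStrategies.PreferConsistent)

    // ... so selecting a single key has to happen afterwards, as an RDD filter.
    val wanted = records
      .filter(r => r.key() == "the-key-i-care-about")
      .map(r => (r.key(), r.value()))

    println(s"matching records: ${wanted.count()}")
    sc.stop()
  }
}
```

The filter runs in parallel on the executors, but every record in the offset ranges is still fetched from Kafka first; there is no server-side key filter in this API.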
