
Kafka client and aggregated events

In event-driven design we strive to identify the events we are interested in. Using Kafka we can easily subscribe to a topic (with a new group.id) and start consuming events. With the default retention policy we could also consume messages up to a week old by specifying auto.offset.reset=earliest, right? But what if we want to start from the very beginning? I guess a KTable should be used, but I'm not sure what will happen when a new client subscribes to a stateful stream. Is it true that the new subscriber will receive all aggregated messages?
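(For concreteness, a minimal sketch of the subscription described above, assuming a plain Java consumer; the broker address, topic name, and group.id are hypothetical placeholders.)

```java
import java.time.Duration;
import java.util.List;
import java.util.Properties;
import org.apache.kafka.clients.consumer.ConsumerConfig;
import org.apache.kafka.clients.consumer.ConsumerRecord;
import org.apache.kafka.clients.consumer.ConsumerRecords;
import org.apache.kafka.clients.consumer.KafkaConsumer;
import org.apache.kafka.common.serialization.StringDeserializer;

public class EarliestConsumer {
    public static void main(String[] args) {
        Properties props = new Properties();
        props.put(ConsumerConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");
        // A group.id that has never committed offsets before...
        props.put(ConsumerConfig.GROUP_ID_CONFIG, "my-new-group");
        // ...combined with earliest, starts from the oldest *retained* offset.
        props.put(ConsumerConfig.AUTO_OFFSET_RESET_CONFIG, "earliest");
        props.put(ConsumerConfig.KEY_DESERIALIZER_CLASS_CONFIG, StringDeserializer.class.getName());
        props.put(ConsumerConfig.VALUE_DESERIALIZER_CLASS_CONFIG, StringDeserializer.class.getName());

        try (KafkaConsumer<String, String> consumer = new KafkaConsumer<>(props)) {
            consumer.subscribe(List.of("events"));
            while (true) {
                ConsumerRecords<String, String> records = consumer.poll(Duration.ofMillis(500));
                for (ConsumerRecord<String, String> record : records) {
                    System.out.printf("%s -> %s%n", record.key(), record.value());
                }
            }
        }
    }
}
```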

You can't consume data that has been deleted.

That's why KTables are built on top of compacted topics, which retain the latest value for each key and have infinite retention.
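A sketch of creating such a topic, assuming the Java AdminClient; the topic name, partition count, and replication factor are hypothetical:

```java
import java.util.Map;
import java.util.Properties;
import java.util.Set;
import org.apache.kafka.clients.admin.Admin;
import org.apache.kafka.clients.admin.AdminClientConfig;
import org.apache.kafka.clients.admin.NewTopic;
import org.apache.kafka.common.config.TopicConfig;

public class CreateCompactedTopic {
    public static void main(String[] args) throws Exception {
        Properties props = new Properties();
        props.put(AdminClientConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");
        try (Admin admin = Admin.create(props)) {
            NewTopic topic = new NewTopic("events-table", 3, (short) 1)
                    .configs(Map.of(
                            // compaction keeps only the latest value per key
                            TopicConfig.CLEANUP_POLICY_CONFIG, TopicConfig.CLEANUP_POLICY_COMPACT,
                            // -1 disables time-based deletion
                            TopicConfig.RETENTION_MS_CONFIG, "-1"));
            admin.createTopics(Set.of(topic)).all().get();
        }
    }
}
```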

If you want to read the "current state" of the table and get all aggregated messages, you can use Interactive Queries.
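A minimal Interactive Queries sketch, assuming a running KafkaStreams instance whose topology has materialized the table into a key-value store named "events-store" (a hypothetical name):

```java
import org.apache.kafka.streams.KafkaStreams;
import org.apache.kafka.streams.KeyValue;
import org.apache.kafka.streams.StoreQueryParameters;
import org.apache.kafka.streams.state.KeyValueIterator;
import org.apache.kafka.streams.state.QueryableStoreTypes;
import org.apache.kafka.streams.state.ReadOnlyKeyValueStore;

public class TableSnapshot {
    // Iterate over the current state of the table: one latest value per key.
    static void printAll(KafkaStreams streams) {
        ReadOnlyKeyValueStore<String, Long> store = streams.store(
                StoreQueryParameters.fromNameAndType(
                        "events-store", QueryableStoreTypes.keyValueStore()));
        try (KeyValueIterator<String, Long> it = store.all()) {
            while (it.hasNext()) {
                KeyValue<String, Long> entry = it.next();
                System.out.printf("%s -> %d%n", entry.key, entry.value);
            }
        }
    }
}
```

Note that the store is only queryable once the streams instance has reached the RUNNING state.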

not sure what will happen when a new client subscribes to a stateful stream

It needs to read the entire compacted topic, starting from the beginning (the earliest available offset, not necessarily the first message ever produced), since it cannot easily find where in the topic each unique key first appears.
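To illustrate, a sketch of such a new subscriber as a Kafka Streams application; the application.id, topic, and store names are hypothetical. Because the application.id is fresh, there is no existing local state, so on first start the table is rebuilt by replaying the compacted topic from its earliest retained offset:

```java
import java.util.Properties;
import org.apache.kafka.common.serialization.Serdes;
import org.apache.kafka.streams.KafkaStreams;
import org.apache.kafka.streams.StreamsBuilder;
import org.apache.kafka.streams.StreamsConfig;
import org.apache.kafka.streams.kstream.Consumed;
import org.apache.kafka.streams.kstream.KTable;
import org.apache.kafka.streams.kstream.Materialized;

public class NewSubscriber {
    public static void main(String[] args) {
        Properties props = new Properties();
        // A fresh application.id means no prior state: full replay on start.
        props.put(StreamsConfig.APPLICATION_ID_CONFIG, "new-subscriber-app");
        props.put(StreamsConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");

        StreamsBuilder builder = new StreamsBuilder();
        KTable<String, String> table = builder.table(
                "events-table",
                Consumed.with(Serdes.String(), Serdes.String()),
                Materialized.as("events-store"));

        KafkaStreams streams = new KafkaStreams(builder.build(), props);
        streams.start(); // restores "events-store" by scanning the whole topic
        Runtime.getRuntime().addShutdownHook(new Thread(streams::close));
    }
}
```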
