
Dynamic consumption of messages from a Kafka topic

I need to consume messages from topics that are created dynamically by the producer. I have used the topic-pattern approach in the consumer, @KafkaListener(topicPattern = "topicname_.*"), and have also set metadata.max.age.ms=3000. But apparently, unless I set auto.offset.reset to earliest, the new topics are not picked up. In our requirement, auto.offset.reset has to be set to latest to avoid duplicate consumption of messages.
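For reference, a minimal sketch of the setup described above, assuming Spring Boot with spring-kafka; the class name and group id are illustrative, and metadata.max.age.ms / auto.offset.reset would be supplied through the consumer configuration (for example spring.kafka.consumer.* properties), not shown here.

import org.springframework.kafka.annotation.KafkaListener;
import org.springframework.stereotype.Component;

@Component
public class DynamicTopicListener {

    // The pattern subscription picks up newly created topics when the consumer's
    // broker metadata is refreshed, i.e. every metadata.max.age.ms (3000 ms here).
    @KafkaListener(topicPattern = "topicname_.*", groupId = "dynamic-topic-group")
    public void listen(String message) {
        System.out.println("Received: " + message);
    }
}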

Any ideas on how to achieve the same?

Speaking at a high level, Kafka's design philosophy does not really support this: adding topics at runtime triggers a rebalance of the consumer group, which should be avoided where possible. But if we manage to restart the consumer group every time a new topic is added, we can handle this gracefully.
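One possible way to implement the "restart the consumer group" idea is through Spring Kafka's KafkaListenerEndpointRegistry, which can stop and start the containers created for a @KafkaListener. This is only a sketch: it assumes something else (an AdminClient poll, a notification from the producer side) tells you that a new matching topic has appeared, and the listener id passed in is hypothetical.

import org.springframework.kafka.config.KafkaListenerEndpointRegistry;
import org.springframework.kafka.listener.MessageListenerContainer;
import org.springframework.stereotype.Component;

@Component
public class ListenerRestarter {

    private final KafkaListenerEndpointRegistry registry;

    public ListenerRestarter(KafkaListenerEndpointRegistry registry) {
        this.registry = registry;
    }

    // Call this whenever you learn that a new topic matching the pattern exists.
    public void restart(String listenerId) {
        MessageListenerContainer container = registry.getListenerContainer(listenerId);
        if (container != null) {
            container.stop();   // leaves the consumer group cleanly
            container.start();  // re-subscribes, picking up the new topics
        }
    }
}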

The current Spring Kafka integration provides the @KafkaListener annotation, which turns a POJO listener into a Kafka consumer by creating a listener container through the container factory passed to it. This consumer listens to topics declared as a string array of topic names, topic expressions, topic patterns, etc. At best, this lets us resolve the topics as property keys instead of hardcoding the names directly; however, the keys themselves are still hardcoded in the array passed to @KafkaListener.

Example:

@KafkaListener(topics = {"${kafka.topics.receipt.cancel.name}"}, containerFactory = "kafkaContainerFactory")

Note: the value of the KafkaListener.topics annotation attribute must be an array initializer, therefore the hardcoding is mandatory.
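For completeness, here is one way the "kafkaContainerFactory" referenced in the example might be defined; the broker address, group id and deserializers are assumptions for illustration, not taken from the original post.

import java.util.HashMap;
import java.util.Map;

import org.apache.kafka.clients.consumer.ConsumerConfig;
import org.apache.kafka.common.serialization.StringDeserializer;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.kafka.config.ConcurrentKafkaListenerContainerFactory;
import org.springframework.kafka.core.DefaultKafkaConsumerFactory;

@Configuration
public class KafkaConsumerConfig {

    @Bean
    public ConcurrentKafkaListenerContainerFactory<String, String> kafkaContainerFactory() {
        Map<String, Object> props = new HashMap<>();
        props.put(ConsumerConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");
        props.put(ConsumerConfig.GROUP_ID_CONFIG, "receipt-consumer-group");
        props.put(ConsumerConfig.KEY_DESERIALIZER_CLASS_CONFIG, StringDeserializer.class);
        props.put(ConsumerConfig.VALUE_DESERIALIZER_CLASS_CONFIG, StringDeserializer.class);
        // "latest" avoids re-reading old records, as required in the question.
        props.put(ConsumerConfig.AUTO_OFFSET_RESET_CONFIG, "latest");
        // Refresh topic metadata frequently so new topics matching a pattern are noticed.
        props.put("metadata.max.age.ms", "3000");

        ConcurrentKafkaListenerContainerFactory<String, String> factory =
                new ConcurrentKafkaListenerContainerFactory<>();
        factory.setConsumerFactory(new DefaultKafkaConsumerFactory<>(props));
        return factory;
    }
}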
