
Kafka reading old and new value from topic

We have a producer-consumer environment and we are using Spring Boot for our project. The Kafka consumer configuration is done with the following class:

import java.util.HashMap;
import java.util.Map;

import org.apache.kafka.clients.consumer.ConsumerConfig;
import org.apache.kafka.common.serialization.StringDeserializer;
import org.springframework.beans.factory.annotation.Value;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.kafka.annotation.EnableKafka;
import org.springframework.kafka.config.ConcurrentKafkaListenerContainerFactory;
import org.springframework.kafka.core.ConsumerFactory;
import org.springframework.kafka.core.DefaultKafkaConsumerFactory;

@Configuration
@EnableKafka
public class DefaultKafkaConsumerConfig {

    @Value("${spring.kafka.bootstrap-servers}")
    private String bootstrapServers;

    @Value("${spring.kafka.bootstrap-servers-group}")
    private String bootstrapServersGroup;


    @Bean
    public ConsumerFactory<String,String> consumerDefaultFactory(){
        Map<String, Object> props = new HashMap<>();
        props.put(ConsumerConfig.BOOTSTRAP_SERVERS_CONFIG, bootstrapServers);
        props.put(ConsumerConfig.KEY_DESERIALIZER_CLASS_CONFIG, StringDeserializer.class); // must match the factory's String key type
        props.put(ConsumerConfig.VALUE_DESERIALIZER_CLASS_CONFIG, StringDeserializer.class);
     
        props.put(ConsumerConfig.GROUP_ID_CONFIG, bootstrapServersGroup);
        props.put(ConsumerConfig.AUTO_OFFSET_RESET_CONFIG, "latest");
        props.put(ConsumerConfig.ENABLE_AUTO_COMMIT_CONFIG, true);
     
        return new DefaultKafkaConsumerFactory<>(props);
    }

    @Bean
    public ConcurrentKafkaListenerContainerFactory<String, String> kafkaListenerDefaultContainerFactory() {
        ConcurrentKafkaListenerContainerFactory<String, String> factory = new ConcurrentKafkaListenerContainerFactory<>();
        factory.setConsumerFactory(consumerDefaultFactory());
        return factory;
    }


}
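For reference, a minimal listener wired to this factory might look like the sketch below. The topic name, class, and method body are hypothetical, for illustration only:

import org.springframework.kafka.annotation.KafkaListener;
import org.springframework.stereotype.Service;

// Hypothetical consumer sketch, not the asker's code: logs each record's value.
@Service
public class LiveEventListener {

    // "live-events" is an assumed topic name; containerFactory refers to the bean above.
    @KafkaListener(topics = "live-events",
                   containerFactory = "kafkaListenerDefaultContainerFactory")
    public void onMessage(String value) {
        // Each value is expected to carry a status such as "live:0" or "live:1".
        System.out.println("Received: " + value);
    }
}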

SCENARIO: We are writing values to Kafka topics. Consider a topic where we publish live-event data, with a status of "live:0" for a completed event and "live:1" for a live event. When an event is about to go live, its status is updated and written to the topic, and we process the event based on what we read from that topic.
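To illustrate the scenario, a status update might be published with a sketch like this one; the topic name, key, and payload format are assumptions, not the asker's actual code:

import org.springframework.kafka.core.KafkaTemplate;
import org.springframework.stereotype.Service;

// Hypothetical producer sketch: each status change is appended to the topic as a new record.
@Service
public class LiveStatusPublisher {

    private final KafkaTemplate<String, String> kafkaTemplate;

    public LiveStatusPublisher(KafkaTemplate<String, String> kafkaTemplate) {
        this.kafkaTemplate = kafkaTemplate;
    }

    public void markLive(String eventId) {
        // Keying by event id keeps all updates for one event on the same partition,
        // so a consumer sees them in order.
        kafkaTemplate.send("live-events", eventId, "live:1");
    }

    public void markCompleted(String eventId) {
        kafkaTemplate.send("live-events", eventId, "live:0");
    }
}

Note that a Kafka topic is an append-only log: writing "live:0" does not replace the earlier "live:1" record; both remain in the topic, and a consumer may read both.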

ISSUE: When the event goes live, I read the "live:1" record from the topic and process it. But when the event is updated and new data is written to the topic, I can read the new data, yet I receive the old data as well. Because I get both old and new data at the same time, my event is affected: sometimes it shows as live, sometimes as completed.

Does anyone have any suggestions? Why am I getting both the previously committed data and the newly updated data? Is there anything I am missing in the configuration?

You may want to check a couple of things: 1. the number of partitions, 2. the number of consumers.
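One way to inspect both is Kafka's AdminClient; a minimal sketch, assuming a broker on localhost:9092 and placeholder topic/group names:

import java.util.Collections;
import java.util.Properties;
import org.apache.kafka.clients.admin.AdminClient;
import org.apache.kafka.clients.admin.AdminClientConfig;
import org.apache.kafka.clients.admin.ConsumerGroupDescription;
import org.apache.kafka.clients.admin.TopicDescription;

// Prints a topic's partition count and the number of active consumers in a group.
public class TopicInspector {
    public static void main(String[] args) throws Exception {
        Properties props = new Properties();
        props.put(AdminClientConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092"); // assumed broker address
        try (AdminClient admin = AdminClient.create(props)) {
            TopicDescription topic = admin.describeTopics(Collections.singleton("live-events"))
                    .all().get().get("live-events");
            System.out.println("Partitions: " + topic.partitions().size());

            ConsumerGroupDescription group = admin.describeConsumerGroups(Collections.singleton("my-group"))
                    .all().get().get("my-group");
            System.out.println("Active consumers: " + group.members().size());
        }
    }
}

If the group has more consumers than the topic has partitions, the extra consumers sit idle; with fewer, one consumer owns several partitions.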

Does this also mean that you are re-writing the consumed message back to the topic with the new status?

// Send asynchronously and log the outcome in the callback.
try {
  ListenableFuture<SendResult<String, String>> futureResult = this.kafkaTemplate.send(topicName, message);
  futureResult.addCallback(new ListenableFutureCallback<SendResult<String, String>>() {

    @Override
    public void onSuccess(SendResult<String, String> result) {
        log.info("Message successfully sent to topic {} with offset {} ", result.getRecordMetadata().topic(), result.getRecordMetadata().offset());
    }

    @Override
    public void onFailure(Throwable ex) {
        FAILMESSAGELOGGER.info("{},{}", topicName, message);
        log.info("Unable to send Message to topic {} due to ", topicName, ex);
    }
      
  });
} catch (Exception e) {
  log.error("Outer Exception occured while sending message  {} to topic {}", new Object[] { message, topicName, e });
  FAILMESSAGELOGGER.info("{},{}", topicName, message);
} 

This is what we have.
