
How to configure Spring Kafka to start additional instances of a Kafka producer

I have a Kafka producer used in a request/reply configuration. When one instance of the producer is launched, it works perfectly. However, when a second instance of the producer is launched, the second instance does not work. It writes the message to the topic correctly, and the consumer processes the message and sends the reply back, but the producer never finds the reply it is waiting for and times out. It appears that the reply is picked up by the first instance of the producer; since the first instance is not expecting that reply, the request/reply exchange fails. Is there any configuration missing to make the second instance work? This POC is to be used in an OpenShift pod, so it should be able to scale to multiple producer and multiple consumer instances. My consumer and producer configurations are below. Thanks.
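To make the failure mode concrete: Spring's `ReplyingKafkaTemplate` matches each incoming reply against the correlation ids of requests that *this* instance sent; a reply consumed by a different instance is simply discarded. The following is a toy model of that bookkeeping in plain Java (no Kafka; `PendingReplies`, `expect`, and `accept` are illustrative names, not Spring Kafka API):

```java
import java.util.concurrent.ConcurrentHashMap;
import java.util.concurrent.ConcurrentMap;

// Toy model of reply correlation: each producer instance tracks only the
// correlation ids of requests IT sent. A reply whose correlation id is
// unknown to the consuming instance is dropped, while the instance that
// actually sent the request keeps waiting and eventually times out.
class PendingReplies {
    private final ConcurrentMap<String, String> pending = new ConcurrentHashMap<>();

    // called when this instance sends a request
    void expect(String correlationId) {
        pending.put(correlationId, "");
    }

    // called when this instance consumes a record from the reply topic;
    // returns false if the reply belongs to some other instance
    boolean accept(String correlationId, String payload) {
        return pending.replace(correlationId, payload) != null;
    }
}

public class ReplyRoutingDemo {
    public static void main(String[] args) {
        PendingReplies instanceA = new PendingReplies();
        PendingReplies instanceB = new PendingReplies();

        instanceB.expect("req-1"); // instance B sent the request...
        // ...but with both instances consuming the shared reply topic in the
        // same consumer group, the reply partition may be assigned to A:
        System.out.println(instanceA.accept("req-1", "reply")); // false: dropped
        System.out.println(instanceB.accept("req-1", "reply")); // true: would match
    }
}
```

This is why the question's symptom is "the message is picked up by the first instance": both instances share the reply topic, but only the originator can correlate the reply.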

Kafka Producer Config

@Configuration
public class KafkaConfig {

    @Value("${spring.kafka.bootstrap-servers}")
    private String bootstrapServers;

    @Value("${spring.kafka.consumer.group-id}")
    private String groupId;

    @Value("${kafka.topic.request-reply-topic}")
    String requestReplyTopic;

    @Value("${kafka.request-reply.timeout-ms}")
    private Long replyTimeout;

    @Bean
    public Map<String, Object> consumerConfigs() {
        Map<String, Object> props = new HashMap<>();
        props.put(ConsumerConfig.BOOTSTRAP_SERVERS_CONFIG, bootstrapServers);
        props.put(ConsumerConfig.KEY_DESERIALIZER_CLASS_CONFIG, StringDeserializer.class);
        props.put(ConsumerConfig.VALUE_DESERIALIZER_CLASS_CONFIG, JsonDeserializer.class);
        props.put(ConsumerConfig.GROUP_ID_CONFIG, groupId);

        return props;
    }

    @Bean
    public Map<String, Object> producerConfigs() {
        Map<String, Object> props = new HashMap<>();
        props.put(ProducerConfig.BOOTSTRAP_SERVERS_CONFIG, bootstrapServers);
        // props.put(ProducerConfig.RETRIES_CONFIG, 0);
        // props.put(ProducerConfig.BATCH_SIZE_CONFIG, 16384);
        // props.put(ProducerConfig.LINGER_MS_CONFIG, 1);
        // props.put(ProducerConfig.BUFFER_MEMORY_CONFIG, 33554432);
        props.put(ProducerConfig.KEY_SERIALIZER_CLASS_CONFIG, StringSerializer.class);
        props.put(ProducerConfig.VALUE_SERIALIZER_CLASS_CONFIG, JsonSerializer.class);

        return props;
    }

    @Bean
    public ReplyingKafkaTemplate<String, InGetAccountInfo, AccountInquiryDto> replyKafkaTemplate(ProducerFactory<String, InGetAccountInfo> pf, KafkaMessageListenerContainer<String, AccountInquiryDto> container) {
        // use the diamond operator so the template is not created as a raw type
        return new ReplyingKafkaTemplate<>(pf, container);
    }

    @Bean
    public ProducerFactory<String, InGetAccountInfo> requestProducerFactory() {
        return new DefaultKafkaProducerFactory<>(producerConfigs());
    }

    @Bean
    public ConsumerFactory<String, AccountInquiryDto> replyConsumerFactory() {
        JsonDeserializer<AccountInquiryDto> jsonDeserializer = new JsonDeserializer<>();
        jsonDeserializer.addTrustedPackages(InGetAccountInfo.class.getPackage().getName());
        jsonDeserializer.addTrustedPackages(AccountInquiryDto.class.getPackage().getName());
        return new DefaultKafkaConsumerFactory<>(consumerConfigs(), new StringDeserializer(),jsonDeserializer);
    }

    @Bean
    public KafkaMessageListenerContainer<String, AccountInquiryDto> replyContainer(ConsumerFactory<String, AccountInquiryDto> cf) {
        ContainerProperties containerProperties = new ContainerProperties(requestReplyTopic);
        return new KafkaMessageListenerContainer<>(cf, containerProperties);
    }



    @Bean
    public KafkaAdmin admin() {
        Map<String, Object> configs = new HashMap<>();
        configs.put(AdminClientConfig.BOOTSTRAP_SERVERS_CONFIG, bootstrapServers);
        return new KafkaAdmin(configs);
    }



    @Bean
    public KafkaAsyncService kafkaAsyncService(){
        return new KafkaAsyncService();
    }


}


 

Kafka producer class

public AccountInquiryDto getModelResponse(InGetAccountInfo accountInfo) throws Exception{

        LOGGER.info("Received request for account " + accountInfo);

        // create producer record
        ProducerRecord<String, InGetAccountInfo> record = new ProducerRecord<String, InGetAccountInfo>(requestTopic,accountInfo);
        // set reply topic in header
        record.headers().add(new RecordHeader(KafkaHeaders.REPLY_TOPIC, requestReplyTopic.getBytes()));

        // post in kafka topic
        RequestReplyFuture<String, InGetAccountInfo, AccountInquiryDto> sendAndReceive = kafkaTemplate.sendAndReceive(record);

        // confirm if producer produced successfully
        SendResult<String, InGetAccountInfo> sendResult = sendAndReceive.getSendFuture().get();

        // print all headers (header values are byte arrays, so decode them)
        sendResult.getProducerRecord().headers().forEach(header -> System.out.println(header.key() + ":" + new String(header.value())));

        // get consumer record
        ConsumerRecord<String, AccountInquiryDto> consumerRecord = sendAndReceive.get();

        ObjectMapper mapper = new ObjectMapper();

        AccountInquiryDto modelResponse = mapper.convertValue(
                consumerRecord.value(),
                new TypeReference<AccountInquiryDto>() { });


        LOGGER.info("Returning record for " + modelResponse);

        return modelResponse;

    }

Kafka Consumer Config

@Configuration
public class KafkaConfig {

  @Value("${spring.kafka.bootstrap-servers}")
  private String bootstrapServers;

  @Value("${spring.kafka.consumer.group-id}")
  private String groupId;

  @Value("${kafka.topic.acct-info.request}")
  private String requestTopic;

  @Value("${kafka.topic.request-reply.timeout-ms}")
  private Long replyTimeout;

  @Bean
  public Map<String, Object> consumerConfigs() {
    Map<String, Object> props = new HashMap<>();
    props.put(ConsumerConfig.BOOTSTRAP_SERVERS_CONFIG, bootstrapServers);
    props.put(ConsumerConfig.KEY_DESERIALIZER_CLASS_CONFIG, StringDeserializer.class);
    props.put(ConsumerConfig.VALUE_DESERIALIZER_CLASS_CONFIG, JsonDeserializer.class);
    props.put(ConsumerConfig.GROUP_ID_CONFIG, groupId);
    return props;
  }

  @Bean
  public Map<String, Object> producerConfigs() {
    Map<String, Object> props = new HashMap<>();
    props.put(ProducerConfig.BOOTSTRAP_SERVERS_CONFIG, bootstrapServers);
    props.put(ProducerConfig.KEY_SERIALIZER_CLASS_CONFIG, StringSerializer.class);
    props.put(ProducerConfig.VALUE_SERIALIZER_CLASS_CONFIG, JsonSerializer.class);
    return props;
  }

  @Bean
  public ConsumerFactory<String, InGetAccountInfo> requestConsumerFactory() {
    JsonDeserializer<InGetAccountInfo> jsonDeserializer = new JsonDeserializer<>();
    jsonDeserializer.addTrustedPackages(InGetAccountInfo.class.getPackage().getName());
    return new DefaultKafkaConsumerFactory<>(consumerConfigs(), new StringDeserializer(),jsonDeserializer);
  }

  @Bean
  public KafkaListenerContainerFactory<ConcurrentMessageListenerContainer<String, InGetAccountInfo>> requestReplyListenerContainerFactory() {
    ConcurrentKafkaListenerContainerFactory<String, InGetAccountInfo> factory = new ConcurrentKafkaListenerContainerFactory<>();
    factory.setConsumerFactory(requestConsumerFactory());
    factory.setConcurrency(3);
    factory.setReplyTemplate(replyTemplate());
    return factory;
  }

  @Bean
  public ProducerFactory<String, AccountInquiryDto> replyProducerFactory() {
    return new DefaultKafkaProducerFactory<>(producerConfigs());
  }

  @Bean
  public KafkaTemplate<String, AccountInquiryDto> replyTemplate() {
    return new KafkaTemplate<>(replyProducerFactory());
  }

  @Bean
  public DepAcctInqConsumerController Controller() {
    return new DepAcctInqConsumerController();
  }
  @Bean
  public KafkaAdmin admin() {
    Map<String, Object> configs = new HashMap<>();
    configs.put(AdminClientConfig.BOOTSTRAP_SERVERS_CONFIG, bootstrapServers);
    return new KafkaAdmin(configs);
  }

  @Bean
  public NewTopic requestTopic() {
    Map<String, String> configs = new HashMap<>();
    configs.put("retention.ms", replyTimeout.toString());
    return new NewTopic(requestTopic, 2, (short) 2).configs(configs);
  }


}

Kafka Consumer class

  @KafkaListener(topics = "${kafka.topic.acct-info.request}", containerFactory = "requestReplyListenerContainerFactory")
  @SendTo
  public Message<?> listenPartition0(InGetAccountInfo accountInfo,
                                     @Header(KafkaHeaders.REPLY_TOPIC) byte[] replyTo,
                                     @Header(KafkaHeaders.RECEIVED_PARTITION_ID) int id) {

    try {

      LOGGER.info("Received request for partition id = " +  id);
      LOGGER.info("Received request for accountInfo = " +  accountInfo.getAccountNumber());

      AccountInquiryDto accountInfoDto  = getAccountInquiryDto(accountInfo);

      LOGGER.info("Returning accountInfoDto = " +  accountInfoDto.toString());

      return MessageBuilder.withPayload(accountInfoDto)
              .setHeader(KafkaHeaders.TOPIC, replyTo)
              .setHeader(KafkaHeaders.RECEIVED_PARTITION_ID, id)
              .build();


    } catch (Exception e) {
      LOGGER.error(e.toString(),e);
    }

    return null;
  }

I was able to solve the problem by modifying the producer configuration to add a CLIENT_ID_CONFIG entry with a unique value per instance:

@Bean
public Map<String, Object> producerConfigs() {
    Map<String, Object> props = new HashMap<>();
    props.put(ProducerConfig.BOOTSTRAP_SERVERS_CONFIG, bootstrapServers);
    props.put(ProducerConfig.CLIENT_ID_CONFIG, clientId +  "-" + UUID.randomUUID().toString());
    props.put(ProducerConfig.RETRIES_CONFIG,"2");
    props.put(ProducerConfig.KEY_SERIALIZER_CLASS_CONFIG, StringSerializer.class);
    props.put(ProducerConfig.VALUE_SERIALIZER_CLASS_CONFIG, JsonSerializer.class);

    return props;
}
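The key detail in the fix above is that each launched instance builds a distinct `client.id` by appending a random UUID to the configured base name, so two instances never register with the broker under the same id. A standalone sketch of just that suffixing (plain Java, no Kafka; `uniqueClientId` and the base name `acct-inquiry-producer` are hypothetical, mirroring the `CLIENT_ID_CONFIG` value built in `producerConfigs()`):

```java
import java.util.UUID;

public class ClientIdSuffix {
    // Hypothetical helper mirroring the CLIENT_ID_CONFIG value in the
    // answer: base name plus a random UUID, so every launched instance
    // presents a distinct client id to the broker.
    static String uniqueClientId(String base) {
        return base + "-" + UUID.randomUUID();
    }

    public static void main(String[] args) {
        String first = uniqueClientId("acct-inquiry-producer");
        String second = uniqueClientId("acct-inquiry-producer");
        System.out.println(first.startsWith("acct-inquiry-producer-")); // true
        System.out.println(first.equals(second));                        // false
    }
}
```

Because the suffix is generated at bean-creation time, each application instance (for example, each OpenShift pod replica) gets its own id without any coordination between instances.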
