
How to manually commit offsets on Kafka source module on getting acknowledgment from Kafka sink module in Spring XD?

In an XD stream, messages are consumed from a Kafka topic by the source module and then sent to the Kafka sink module. The reason for developing custom source and sink Kafka modules is that I want to update the source module's offsets only when an acknowledgment is received from the downstream sink module confirming that the message was sent successfully.

I am using Spring Integration Kafka 2.0.1.RELEASE and Spring Kafka 1.0.3.RELEASE with topics in a Kafka 0.10.0.0 environment. I have tried the following approach:

Source module configuration:

@Configuration
public class ModuleConfiguration {

    @Value("${topic}")
    private String topic;

    @Value("${brokerList}")
    private String brokerAddress;

    @Bean
    public SubscribableChannel output() {
        DirectChannel output = new DirectChannel();
        return output;
    }

    @Autowired
    TopicPartitionInitialOffset topicPartition;

    @Bean
    public TopicPartitionInitialOffset topicPartition(){
        return new TopicPartitionInitialOffset(this.topic, 0, (long) 0);    
    }

    @Bean
    public KafkaMessageListenerContainer<String, String> container() throws Exception {
        ContainerProperties containerProps = new ContainerProperties(topicPartition);
        containerProps.setAckMode(AckMode.MANUAL);
        KafkaMessageListenerContainer<String, String> kafkaMessageListenerContainer = new KafkaMessageListenerContainer<>(consumerFactory(),containerProps);
        return kafkaMessageListenerContainer;
    }
    @Bean
    public ConsumerFactory<String, String> consumerFactory() {
        Map<String, Object> props = new HashMap<>();
        props.put(ConsumerConfig.BOOTSTRAP_SERVERS_CONFIG, this.brokerAddress);
        props.put(ConsumerConfig.GROUP_ID_CONFIG, "test-consumer-group");
        props.put(ConsumerConfig.ENABLE_AUTO_COMMIT_CONFIG, false);
        props.put(ConsumerConfig.SESSION_TIMEOUT_MS_CONFIG, 15000);
        props.put(ConsumerConfig.KEY_DESERIALIZER_CLASS_CONFIG, StringDeserializer.class);
        props.put(ConsumerConfig.VALUE_DESERIALIZER_CLASS_CONFIG, StringDeserializer.class);
        DefaultKafkaConsumerFactory<String,String> consumerFactory =  new DefaultKafkaConsumerFactory<>(props);
        return consumerFactory;
    }
}

Source module: InboundKafkaMessageDrivenAdapter

@MessageEndpoint
@Import(ModuleConfiguration.class)
public class InboundKafkaMessageDrivenAdapter {

    @Autowired
    KafkaMessageListenerContainer<String, String> container;

    @Autowired
    SubscribableChannel output;

    @Bean
    public KafkaMessageDrivenChannelAdapter<String, String> adapter(KafkaMessageListenerContainer<String, String> container) {
        KafkaMessageDrivenChannelAdapter<String, String> kafkaMessageDrivenChannelAdapter = new KafkaMessageDrivenChannelAdapter<>(container);
        kafkaMessageDrivenChannelAdapter.setOutputChannel(output);
        return kafkaMessageDrivenChannelAdapter;
    }
}

Sink module: Configuration

@Configuration
@EnableIntegration
public class ModuleConfiguration {

    @Value("${topic}")
    private String topic;

    @Value("${brokerList}")
    private String brokerAddress;

    @Bean
    public KafkaProducerMessageHandler<String,String> handler() throws Exception {
        KafkaProducerMessageHandler<String, String> handler = new KafkaProducerMessageHandler<>(kafkaTemplate());
        handler.setTopicExpression(new LiteralExpression(this.topic));
        return handler;
    }

    @Bean
    public SubscribableChannel input() {
        return new DirectChannel();
    }

    @Bean
    public KafkaTemplate<String, String> kafkaTemplate() {
        return new KafkaTemplate<>(producerFactory());
    }

    @Bean
    public ProducerFactory<String, String> producerFactory() {
        Map<String, Object> props = new HashMap<>();
        props.put(ProducerConfig.BOOTSTRAP_SERVERS_CONFIG, this.brokerAddress);
        props.put(ProducerConfig.RETRIES_CONFIG, 0);
        props.put(ProducerConfig.BATCH_SIZE_CONFIG, 16384);
        props.put(ProducerConfig.LINGER_MS_CONFIG, 1);
        props.put(ProducerConfig.BUFFER_MEMORY_CONFIG, 33554432);
        props.put(ProducerConfig.KEY_SERIALIZER_CLASS_CONFIG, StringSerializer.class);
        props.put(ProducerConfig.VALUE_SERIALIZER_CLASS_CONFIG, StringSerializer.class);
        return new DefaultKafkaProducerFactory<>(props);
    }
}

Sink module: SinkActivator

@Import(ModuleConfiguration.class)
@MessageEndpoint
public class SinkActivator {

    @Autowired
    KafkaProducerMessageHandler<String,String> handler;

    @Autowired
    SubscribableChannel input;

    @ServiceActivator(inputChannel = "input")
    public void sendMessage(Message<?> msg) throws Exception {
        Acknowledgment acknowledgment = msg.getHeaders().get(KafkaHeaders.ACKNOWLEDGMENT, Acknowledgment.class);
        handler.handleMessage(msg);
        acknowledgment.acknowledge();
    }
}

The source successfully receives the messages and sends them to the sink, but when I try to get the acknowledgment in the sink:

Acknowledgment acknowledgment = msg.getHeaders().get(KafkaHeaders.ACKNOWLEDGMENT, Acknowledgment.class);

the following exception is thrown:

Caused by: java.lang.IllegalArgumentException: Incorrect type specified for header 'kafka_acknowledgment'. Expected [interface org.springframework.kafka.support.Acknowledgment] but actual type is [class org.springframework.kafka.listener.KafkaMessageListenerContainer$ListenerConsumer$ConsumerAcknowledgment]

In the spring-integration-kafka-2.0.1.RELEASE source code, when AckMode = MANUAL the KafkaMessageListenerContainer class adds the kafka_acknowledgment header to the message, but its type is the inner static class ConsumerAcknowledgment.
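
As a quick check (a hypothetical diagnostic snippet, not part of the deployed modules; the class name and wiring are mine), the header can be pulled with the untyped get, which skips the type check that throws above and shows what actually arrives in the sink:

import org.springframework.integration.annotation.MessageEndpoint;
import org.springframework.integration.annotation.ServiceActivator;
import org.springframework.kafka.support.Acknowledgment;
import org.springframework.kafka.support.KafkaHeaders;
import org.springframework.messaging.Message;

@MessageEndpoint
public class AckHeaderInspector {

    // Subscribed to the sink's "input" channel in place of SinkActivator for the test.
    @ServiceActivator(inputChannel = "input")
    public void inspect(Message<?> msg) {
        // Untyped get: returns the raw header value without the type check that
        // MessageHeaders.get(key, Class) performs.
        Object rawAck = msg.getHeaders().get(KafkaHeaders.ACKNOWLEDGMENT);
        if (rawAck != null) {
            System.out.println("header class : " + rawAck.getClass().getName());
            System.out.println("class loader : " + rawAck.getClass().getClassLoader());
            System.out.println("usable as Acknowledgment here: " + (rawAck instanceof Acknowledgment));
        }
    }
}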

那么如何從源模塊中獲取來自源模塊的確認信息?

You cannot do that unless you use the local transport; the Acknowledgment is a "live" object and cannot be sent over the wire to another module.

It will work if you use the local transport, but you will then have classloader problems, because each module runs in its own classloader and the Acknowledgment interface is a different instance of the class.

You would have to move spring-integration-kafka and spring-kafka to the xd/lib folder so that the classes are loaded from a common classloader.
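
Assuming both jars are moved to xd/lib and the stream is deployed with the local transport, so that a single class loader supplies the Acknowledgment interface on both sides, a slightly more defensive variant of the SinkActivator above might look like the following sketch (the class name is mine; it sends first and acknowledges only if the header is present and usable):

import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.context.annotation.Import;
import org.springframework.integration.annotation.MessageEndpoint;
import org.springframework.integration.annotation.ServiceActivator;
import org.springframework.integration.kafka.outbound.KafkaProducerMessageHandler;
import org.springframework.kafka.support.Acknowledgment;
import org.springframework.kafka.support.KafkaHeaders;
import org.springframework.messaging.Message;

@MessageEndpoint
@Import(ModuleConfiguration.class)
public class GuardedSinkActivator {

    @Autowired
    KafkaProducerMessageHandler<String, String> handler;

    @ServiceActivator(inputChannel = "input")
    public void sendMessage(Message<?> msg) {
        // Send to the outbound Kafka handler first; if this throws, the offset is
        // never acknowledged and the record can be redelivered after a rebalance or restart.
        handler.handleMessage(msg);

        // Fetch the header untyped and acknowledge (commit the offset) only when it
        // really implements Acknowledgment in this class loader.
        Object rawAck = msg.getHeaders().get(KafkaHeaders.ACKNOWLEDGMENT);
        if (rawAck instanceof Acknowledgment) {
            ((Acknowledgment) rawAck).acknowledge();
        }
    }
}

With a non-local transport this guard simply never fires, since the live Acknowledgment cannot cross the wire, as noted above.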
