
How to manually commit offsets on Kafka source module on getting acknowledgment from Kafka sink module in Spring XD?

In an XD stream, messages are consumed from a Kafka topic by the source module and then sent to the sink Kafka module. The reason for developing custom source and sink Kafka modules is that I want to update the offsets on the source module only when I get an acknowledgment from the downstream sink module that the message was sent successfully.

I am using Spring Integration Kafka 2.0.1.RELEASE and Spring Kafka 1.0.3.RELEASE against topics in a Kafka 0.10.0.0 environment. I have tried the following approach:

Source module configuration:

@Configuration
public class ModuleConfiguration {

    @Value("${topic}")
    private String topic;

    @Value("${brokerList}")
    private String brokerAddress;

    @Bean
    public SubscribableChannel output() {
        DirectChannel output = new DirectChannel();
        return output;
    }

    @Autowired
    TopicPartitionInitialOffset topicPartition;

    @Bean
    public TopicPartitionInitialOffset topicPartition(){
        return new TopicPartitionInitialOffset(this.topic, 0, (long) 0);    
    }

    @Bean
    public KafkaMessageListenerContainer<String, String> container() throws Exception {
        ContainerProperties containerProps = new ContainerProperties(topicPartition);
        containerProps.setAckMode(AckMode.MANUAL);
        KafkaMessageListenerContainer<String, String> kafkaMessageListenerContainer = new KafkaMessageListenerContainer<>(consumerFactory(),containerProps);
        return kafkaMessageListenerContainer;
    }
    @Bean
    public ConsumerFactory<String, String> consumerFactory() {
        Map<String, Object> props = new HashMap<>();
        props.put(ConsumerConfig.BOOTSTRAP_SERVERS_CONFIG, this.brokerAddress);
        props.put(ConsumerConfig.GROUP_ID_CONFIG, "test-consumer-group");
        props.put(ConsumerConfig.ENABLE_AUTO_COMMIT_CONFIG, false);
        props.put(ConsumerConfig.SESSION_TIMEOUT_MS_CONFIG, 15000);
        props.put(ConsumerConfig.KEY_DESERIALIZER_CLASS_CONFIG, StringDeserializer.class);
        props.put(ConsumerConfig.VALUE_DESERIALIZER_CLASS_CONFIG, StringDeserializer.class);
        DefaultKafkaConsumerFactory<String,String> consumerFactory =  new DefaultKafkaConsumerFactory<>(props);
        return consumerFactory;
    }
}

Source module: InboundKafkaMessageDrivenAdapter

@MessageEndpoint
@Import(ModuleConfiguration.class)
public class InboundKafkaMessageDrivenAdapter {

    @Autowired
    KafkaMessageListenerContainer<String, String> container;

    @Autowired
    SubscribableChannel output;

    @Bean
    public KafkaMessageDrivenChannelAdapter<String, String> adapter(KafkaMessageListenerContainer<String, String> container) {
        KafkaMessageDrivenChannelAdapter<String, String> kafkaMessageDrivenChannelAdapter = new KafkaMessageDrivenChannelAdapter<>(container);
        kafkaMessageDrivenChannelAdapter.setOutputChannel(output);
        return kafkaMessageDrivenChannelAdapter;
    }
}

Sink module: Configuration

@Configuration
@EnableIntegration
public class ModuleConfiguration {

    @Value("${topic}")
    private String topic;

    @Value("${brokerList}")
    private String brokerAddress;

    @Bean
    public KafkaProducerMessageHandler<String,String> handler() throws Exception {
        KafkaProducerMessageHandler<String, String> handler = new KafkaProducerMessageHandler<>(kafkaTemplate());
        handler.setTopicExpression(new LiteralExpression(this.topic));
        return handler;
    }

    @Bean
    public SubscribableChannel input() {
        return new DirectChannel();
    }

    @Bean
    public KafkaTemplate<String, String> kafkaTemplate() {
        return new KafkaTemplate<>(producerFactory());
    }

    @Bean
    public ProducerFactory<String, String> producerFactory() {
        Map<String, Object> props = new HashMap<>();
        props.put(ProducerConfig.BOOTSTRAP_SERVERS_CONFIG, this.brokerAddress);
        props.put(ProducerConfig.RETRIES_CONFIG, 0);
        props.put(ProducerConfig.BATCH_SIZE_CONFIG, 16384);
        props.put(ProducerConfig.LINGER_MS_CONFIG, 1);
        props.put(ProducerConfig.BUFFER_MEMORY_CONFIG, 33554432);
        props.put(ProducerConfig.KEY_SERIALIZER_CLASS_CONFIG, StringSerializer.class);
        props.put(ProducerConfig.VALUE_SERIALIZER_CLASS_CONFIG, StringSerializer.class);
        return new DefaultKafkaProducerFactory<>(props);
    }
}

Sink module: SinkActivator

@Import(ModuleConfiguration.class)
@MessageEndpoint
public class SinkActivator {

    @Autowired
    KafkaProducerMessageHandler<String,String> handler;

    @Autowired
    SubscribableChannel input;

    @ServiceActivator(inputChannel = "input")
    public void sendMessage(Message<?> msg) throws Exception {
        Acknowledgment acknowledgment = msg.getHeaders().get(KafkaHeaders.ACKNOWLEDGMENT, Acknowledgment.class);
        handler.handleMessage(msg);
        acknowledgment.acknowledge();
    }
}

The source successfully receives the messages and sends them to the sink, but when I try to get the acknowledgment in the sink:

Acknowledgment acknowledgment = msg.getHeaders().get(KafkaHeaders.ACKNOWLEDGMENT, Acknowledgment.class);

the following exception is thrown:

Caused by: java.lang.IllegalArgumentException: Incorrect type specified for header 'kafka_acknowledgment'. Expected [interface org.springframework.kafka.support.Acknowledgment] but actual type is [class org.springframework.kafka.listener.KafkaMessageListenerContainer$ListenerConsumer$ConsumerAcknowledgment]

In the spring-integration-kafka-2.0.1.RELEASE source, when AckMode = MANUAL the KafkaMessageListenerContainer class adds the kafka_acknowledgment header to the message, but its type is the inner static class ConsumerAcknowledgment.
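(For context, the typed header accessor in Spring's MessageHeaders essentially performs an instance check before casting, which is where the exception above comes from. The following is a simplified paraphrase, not the exact Spring source; ConsumerAcknowledgment does implement Acknowledgment, but if the interface is loaded by a different classloader in each module, the check fails.)

import java.util.Map;

// Rough paraphrase of the typed accessor in MessageHeaders: it refuses to cast
// unless the stored value is an instance of the requested type. With one
// Acknowledgment interface per module classloader, isInstance() sees two
// different Class objects and throws exactly the message shown above.
class HeaderAccessSketch {

    private final Map<String, Object> headers;

    HeaderAccessSketch(Map<String, Object> headers) {
        this.headers = headers;
    }

    @SuppressWarnings("unchecked")
    <T> T get(String key, Class<T> type) {
        Object value = this.headers.get(key);
        if (value == null) {
            return null;
        }
        if (!type.isInstance(value)) {
            throw new IllegalArgumentException("Incorrect type specified for header '" + key
                    + "'. Expected [" + type + "] but actual type is [" + value.getClass() + "]");
        }
        return (T) value;
    }
}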

So how can I get the acknowledgment from the source module in the sink module?

You can't do that unless you use the local transport; the Acknowledgment is a "live" object and cannot be sent over the wire to another module.

It will work if you use the local transport, but you will run into classloader problems, because each module runs in its own classloader and the Acknowledgment interface would be a different instance of the class.

You would have to move spring-integration-kafka and spring-kafka to the xd/lib folder so that the classes are loaded from a common classloader.
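To make that classloader mismatch visible, a small diagnostic like the one below could be dropped into the sink. This is a hypothetical sketch, not part of the original modules, and it assumes the local transport so the header actually reaches the sink; it fetches the header untyped to avoid the IllegalArgumentException and then compares the two copies of the Acknowledgment interface.

import org.springframework.kafka.support.Acknowledgment;
import org.springframework.kafka.support.KafkaHeaders;
import org.springframework.messaging.Message;

// Hypothetical diagnostic: check whether the header value implements an interface
// named org.springframework.kafka.support.Acknowledgment that is a *different*
// Class object than the one visible to the sink module's classloader.
public class AckClassLoaderCheck {

    public static void inspect(Message<?> msg) {
        Object rawAck = msg.getHeaders().get(KafkaHeaders.ACKNOWLEDGMENT);
        if (rawAck == null) {
            System.out.println("No kafka_acknowledgment header present");
            return;
        }
        for (Class<?> iface : rawAck.getClass().getInterfaces()) {
            if (iface.getName().equals(Acknowledgment.class.getName())) {
                boolean sameClass = (iface == Acknowledgment.class);
                System.out.println("Header's Acknowledgment loaded by: " + iface.getClassLoader()
                        + ", sink's copy loaded by: " + Acknowledgment.class.getClassLoader()
                        + ", same Class object: " + sameClass);
            }
        }
    }
}

If the two classloaders differ, moving spring-integration-kafka and spring-kafka to xd/lib, as described above, is what makes both modules see a single Acknowledgment class.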
