
Spring Stream Kafka Binder Test Custom Headers

I am trying to figure out how to include custom headers in the Spring `Message<?>` used by Spring Cloud Stream. My goal is to include some custom header data that would be added in one producer (function) class, passed to Kafka, and then consumed by another class in a different service (along with the custom header data).

I feel like I am missing something, because I can get it working with the `TestChannelBinder`, for example:

import lombok.extern.slf4j.Slf4j;

import org.springframework.messaging.Message;
import org.springframework.stereotype.Component;

import java.util.function.Function;

@Component
@Slf4j
public class BaseStream implements Function<Message<String>, String> {
    @Override
    public String apply(Message<String> transactionMessage) {
        log.debug("Converted Message: {} ", transactionMessage);
        return transactionMessage.getPayload();
    }

}

Test class using the test binder:


import org.junit.jupiter.api.Test;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.boot.test.context.SpringBootTest;
import org.springframework.cloud.stream.binder.test.InputDestination;
import org.springframework.cloud.stream.binder.test.OutputDestination;
import org.springframework.cloud.stream.binder.test.TestChannelBinderConfiguration;
import org.springframework.context.annotation.Import;
import org.springframework.integration.support.MessageBuilder;
import org.springframework.test.context.TestPropertySource;


@SpringBootTest
@TestPropertySource("classpath:testStream.properties")
@Import(TestChannelBinderConfiguration.class)
public class TestForStream {


    @Autowired
    InputDestination inputDestination;
    @Autowired
    OutputDestination outputDestination;

    @Test
    void contextLoads() {
        inputDestination.send(MessageBuilder
                .withPayload("Test Payload")
                .setHeader("customHeader", "headerSpecificData")
                .build());
    }
}

testStream.properties:

spring.cloud.function.definition=baseStream
spring.cloud.stream.bindings.baseStream-in-0.destination=test-in
spring.cloud.stream.bindings.baseStream-out-0.destination=test-out
spring.cloud.stream.bindings.baseStream-in-0.group=test-group-base

Log output when run:

Converted Message: GenericMessage [payload=Test Payload, headers={id=5c6d1082-c084-0b25-4afc-b5d97bf537f9, customHeader=headerSpecificData, contentType=application/json, timestamp=1639398696800, target-protocol=kafka}]

This is what I want it to do. But when I try to test it against the Kafka binder, the whole `Message<String>` object seems to be included in the payload as a JSON string, whereas I expected it to be parsed back into the `Message<String>` input that `BaseStream` requires.

I am just wondering if anyone can see what is going wrong with my test, as I have tried all sorts of things to get it working, and having seen it work with the test binder I assumed it would work with the Kafka binder.

Test class for the Kafka binder test:

import org.apache.kafka.clients.producer.ProducerConfig;
import org.apache.kafka.clients.producer.ProducerRecord;
import org.apache.kafka.common.serialization.StringSerializer;
import org.junit.jupiter.api.Test;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.boot.test.context.SpringBootTest;
import org.springframework.integration.support.MessageBuilder;
import org.springframework.kafka.core.DefaultKafkaProducerFactory;
import org.springframework.kafka.core.KafkaTemplate;
import org.springframework.kafka.core.ProducerFactory;
import org.springframework.kafka.support.serializer.JsonSerializer;
import org.springframework.kafka.test.EmbeddedKafkaBroker;
import org.springframework.kafka.test.context.EmbeddedKafka;
import org.springframework.kafka.test.utils.KafkaTestUtils;
import org.springframework.test.context.TestPropertySource;

import java.util.HashMap;
import java.util.Map;
import java.util.concurrent.CountDownLatch;
import java.util.concurrent.TimeUnit;


@EmbeddedKafka(partitions = 1, brokerProperties = { "listeners=PLAINTEXT://localhost:9092", "port=9092"})
@SpringBootTest
@TestPropertySource("classpath:testStream.properties")
public class TestForStream {

    public static CountDownLatch latch = new CountDownLatch(1);
    @Autowired
    public EmbeddedKafkaBroker broker;

    @Test
    void contextLoads() {
        sleep(5); // included as the embedded broker takes some time to initialise

        sendMessage("test-in", MessageBuilder
                .withPayload("Test Payload")
                .setHeader("customHeader", "headerSpecificData")
                .build());


    }

    public <T> ProducerFactory<String, T> createProducerFactory() {
        Map<String, Object> configs = new HashMap<>(KafkaTestUtils.producerProps(broker));
        configs.put(ProducerConfig.KEY_SERIALIZER_CLASS_CONFIG, StringSerializer.class);
        configs.put(ProducerConfig.VALUE_SERIALIZER_CLASS_CONFIG, JsonSerializer.class);
        //Is JsonSerializer correct for a message?
        return new DefaultKafkaProducerFactory<>(configs);
    }

    public <T> void sendMessage(String topic, T listObj) {
        try {
            KafkaTemplate<String, T> kafkaTemplate = new KafkaTemplate<>(createProducerFactory());
            kafkaTemplate.send(new ProducerRecord<>(topic, listObj));
        }catch (Exception e){
            e.printStackTrace();
        }
    }

    public void sleep(long time){
        try {
            latch.await(time, TimeUnit.SECONDS);
        } catch (InterruptedException e) {
            e.printStackTrace();
        }
    }

}


Kafka binder test log for the message:

Converted Message: GenericMessage [payload={"payload":"Test Payload","headers":{"customHeader":"headerSpecificData","id":"d540a3ca-28db-b137-fc86-c25cc4b7eb8b","timestamp":1639399810476}}, headers={deliveryAttempt=1, kafka_timestampType=CREATE_TIME, kafka_receivedTopic=test-in, target-protocol=kafka, kafka_offset=0, scst_nativeHeadersPresent=true, kafka_consumer=org.apache.kafka.clients.consumer.KafkaConsumer@79580279, id=1cf2d382-df29-2672-4180-07da99e58244, kafka_receivedPartitionId=0, kafka_receivedTimestamp=1639399810526, contentType=application/json, __TypeId__=[B@24c79350, kafka_groupId=test-group-base, timestamp=1639399810651}]

So here the message has been wrapped up inside the payload, while the Kafka headers are included in the headers as expected.

I have tried the `spring.cloud.stream.kafka.binder.headers` and `headerMode` properties to see if they would change anything, but to no avail.
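For reference, those binder options are set roughly like this (a sketch only, reusing the binding and header names from the code above; as noted, they did not change the behaviour here):

```properties
# Kafka binder: explicit list of headers the binder should transport
spring.cloud.stream.kafka.binder.headers=customHeader
# Per-binding header mode (none / headers / embeddedHeaders)
spring.cloud.stream.bindings.baseStream-in-0.consumer.headerMode=headers
```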

Edit:

Using `springCloudVersion = 2020.0.3`.

I was using:

    public <T> void sendMessage(String topic, T listObj) {
        try {
            KafkaTemplate<String, T> kafkaTemplate = new KafkaTemplate<>(createProducerFactory());
            kafkaTemplate.send(new ProducerRecord<>(topic, listObj));
        }catch (Exception e){
            e.printStackTrace();
        }
    }

to send the message, which put the whole `Message` object in as the record value.

What I should have been using:

    public void sendMessage(String topic, Message<?> listObj) {
        try {
            KafkaTemplate<String, Message<?>> kafkaTemplate = new KafkaTemplate<>(createProducerFactory());
            kafkaTemplate.setDefaultTopic(topic);
            kafkaTemplate.send(listObj);
        }catch (Exception e){
            e.printStackTrace();
        }
    }
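The difference between the two `sendMessage` variants can be sketched with plain Java, no Kafka or Spring involved (the class and method names below are invented for illustration; the real conversion is done by Spring Kafka's message converter):

```java
import java.util.LinkedHashMap;
import java.util.Map;
import java.util.stream.Collectors;

// Stdlib-only sketch of why the two send paths above produce different records.
public class SendPathSketch {

    // Path 1: new ProducerRecord<>(topic, messageObject) with JsonSerializer.
    // The entire Message (payload AND headers) is serialized into the record
    // value, which is why the log showed the headers nested inside the payload.
    static String recordValueWhenWholeMessageIsSerialized(String payload, Map<String, String> headers) {
        String headerJson = headers.entrySet().stream()
                .map(e -> "\"" + e.getKey() + "\":\"" + e.getValue() + "\"")
                .collect(Collectors.joining(","));
        return "{\"payload\":\"" + payload + "\",\"headers\":{" + headerJson + "}}";
    }

    // Path 2: kafkaTemplate.send(Message<?>). The template's message converter
    // puts only the payload into the record value and copies each message
    // header onto the Kafka record headers, so the consumer gets them back as
    // headers rather than embedded in the payload.
    static Map.Entry<String, Map<String, String>> recordWhenMessageIsConverted(
            String payload, Map<String, String> headers) {
        return Map.entry(payload, new LinkedHashMap<>(headers));
    }

    public static void main(String[] args) {
        Map<String, String> headers = Map.of("customHeader", "headerSpecificData");
        System.out.println(recordValueWhenWholeMessageIsSerialized("Test Payload", headers));
        System.out.println(recordWhenMessageIsConverted("Test Payload", headers));
    }
}
```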

