
How to Create Integration Test for Spring Kafka Listener

I have one microservice that sends a message to another microservice, which should consume it.

The Kafka configuration works and everything runs fine, but I need to create an integration test for this code, and I have no idea how.

My KafkaConsumer class, annotated with the @Component annotation:

private static final Logger logger = LoggerFactory.getLogger(KafkaReactionConsumerMessageComponent.class);

private final ReactionsService reactionsService;

public KafkaReactionConsumerMessageComponent(ReactionsService reactionsService) {
    this.reactionsService = reactionsService;
}

@KafkaListener(topics = "reaction-topic", clientIdPrefix = "string", groupId = "magpie-trending")
public void consumingReactionMessages(ConsumerRecord<String, String> cr,
                                      @Payload String payload) {
    logger.info("[JSON] received Payload: {}", payload);

    try {
        ObjectMapper mapper = new ObjectMapper();
        ReactionMessage message = mapper.readValue(payload, ReactionMessage.class);
        if (StringUtils.equals("unloved", message.getReactionType())) {
            reactionsService.deleteReactionsByUserIdAndPostId(message.getPost().getPostId(), message.getUser().getUserId());
            logger.info("Deleted reactions from database with postId: {} and userId: {}", message.getPost().getPostId(), message.getUser().getUserId());
        } else {
            List<Reaction> reactions = creatReactions(message).stream()
                    .map(reactionsService::insertReaction).collect(Collectors.toList());
            logger.info("Added reactions to database: {}", reactions);
        }
    } catch (Exception e) {
        logger.error("Cannot deserialize payload to ReactionMessage", e);
    }
}

My integration test:

private static final String TOPIC = "reaction-topic";
private final Logger logger = LoggerFactory.getLogger(KafkaDeletePostConsumerMessageComponent.class);

private final KafkaReactionConsumerMessageComponent kafkaReactionConsumerMessageComponent;
private final EmbeddedKafkaBroker embeddedKafkaBroker;

private Consumer<String, String> consumer;

@SuppressWarnings("SpringJavaAutowiringInspection")
@Autowired
public KafkaReactionMessageConsumerTest(KafkaReactionConsumerMessageComponent kafkaReactionConsumerMessageComponent,
                                        EmbeddedKafkaBroker embeddedKafkaBroker) {
    this.kafkaReactionConsumerMessageComponent = kafkaReactionConsumerMessageComponent;
    this.embeddedKafkaBroker = embeddedKafkaBroker;
}

@BeforeEach
public void setUp() {
    Map<String, Object> configs = new HashMap<>(KafkaTestUtils.consumerProps("consumer", "true", embeddedKafkaBroker));
    consumer = new DefaultKafkaConsumerFactory<>(configs, new StringDeserializer(), new StringDeserializer()).createConsumer();
    consumer.subscribe(Collections.singleton(TOPIC));
    consumer.poll(Duration.ZERO);
}

@AfterEach
public void tearDown() {
    consumer.close();
}

@Test
public void shouldConsumeAndInsertInDatabaseReactionDomain() {
    ReactionMessage reactionMessage = new ReactionMessage(new PostMessage("1", Set.of("a", "b", "c")),
            new UserMessage("2"), LocalDateTime.now().toString(), "loved");

    Map<String, Object> configs = new HashMap<>(KafkaTestUtils.producerProps(embeddedKafkaBroker));
    Producer<String, String> producer = new DefaultKafkaProducerFactory<>(configs, new StringSerializer(), new StringSerializer()).createProducer();


    producer.send(new ProducerRecord<>(TOPIC, "1", reactionMessage.toString()));
    producer.flush();

    assertEquals(3, mongoTemplate.getCollection("reactions").countDocuments());
}

The abstract base class:

@ExtendWith(SpringExtension.class)
@SpringBootTest
@AutoConfigureMockMvc
@EmbeddedKafka(brokerProperties = {
        "log.dir=out/embedded-kafka"
})
public abstract class AbstractMongoEmbeddedTest {

    @Autowired
    private static MongodExecutable mongodExecutable;

    @Autowired
    protected MongoTemplate mongoTemplate;

    @BeforeEach
    void dropPostCollection() {
        mongoTemplate.dropCollection(Reaction.class);
    }
}

Since you are using an embedded Kafka broker you could simply produce/consume the desired topic(s) from within your integration test.

Consuming

Consuming can be done via a simple JUnit rule. A rule serving this purpose can be found here. Feel free to use it.

You can use it like this to assert consumed messages:

assertThat(kafkaConsumeMessagesRule.pollMessage()).isEqualTo("your-expected-message");
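If you prefer not to pull in the rule, the Consumer already built in your setUp() can be asserted on directly with spring-kafka-test's KafkaTestUtils.getSingleRecord, which polls the subscribed topic until exactly one record arrives or a timeout elapses. A minimal sketch, assuming the consumer from the question and a placeholder expected value:

```java
import org.apache.kafka.clients.consumer.Consumer;
import org.apache.kafka.clients.consumer.ConsumerRecord;
import org.springframework.kafka.test.utils.KafkaTestUtils;

import static org.assertj.core.api.Assertions.assertThat;

class ConsumeAssertionSketch {

    // 'consumer' is the Consumer<String, String> created in setUp(),
    // already subscribed to the topic.
    void assertSingleMessage(Consumer<String, String> consumer) {
        // Blocks until one record is available on the topic, then returns it.
        ConsumerRecord<String, String> record =
                KafkaTestUtils.getSingleRecord(consumer, "reaction-topic");
        assertThat(record.value()).isEqualTo("your-expected-message");
    }
}
```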

Producing

For producing messages you can simply wire a org.springframework.kafka.core.KafkaTemplate in your integration test and send messages to a given topic.
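A sketch of that wiring, assuming your test properties point spring.kafka.bootstrap-servers at the embedded broker (for example via spring.kafka.bootstrap-servers=${spring.embedded.kafka.brokers}); the JSON string here is a hypothetical stand-in for a serialized ReactionMessage:

```java
import org.junit.jupiter.api.Test;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.kafka.core.KafkaTemplate;

class ProduceWithKafkaTemplateSketch extends AbstractMongoEmbeddedTest {

    // Auto-configured by Spring Boot when bootstrap-servers resolves
    // to the embedded broker.
    @Autowired
    private KafkaTemplate<String, String> kafkaTemplate;

    @Test
    void sendsToReactionTopic() {
        // Hypothetical payload matching ReactionMessage's fields.
        String payload = "{\"reactionType\":\"loved\"}";
        kafkaTemplate.send("reaction-topic", "1", payload);
        kafkaTemplate.flush();
        // The listener runs asynchronously, so poll mongoTemplate (or use a
        // timed wait) before asserting on the "reactions" collection.
    }
}
```

Note that sending is asynchronous end to end: asserting on the database immediately after flush(), as in the test above, can race the listener.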
