How to spy on an autowired bean in Spring tests
I have a simple logging-handler bean configuration that I inject into an IntegrationFlow:
@Configuration
class LogHandlerConfiguration {

    private LoggingHandler handler;

    @Bean
    public MessageHandler kafkaSuccessHandler() {
        return getLogger(LoggingHandler.Level.INFO);
    }

    @Bean(name = "kafkaFailureHandler")
    public MessageHandler kafkaFailureHandler() {
        return getLogger(LoggingHandler.Level.ERROR);
    }

    private LoggingHandler getLogger(LoggingHandler.Level level) {
        handler = new LoggingHandler(level);
        handler.setShouldLogFullMessage(Boolean.TRUE);
        return handler;
    }
}
The integration flow under test:
@Bean
IntegrationFlow kafkaFailureFlow(ExecutorChannel kafkaErrorChannel, MessageHandler kafkaFailureHandler) {
    return IntegrationFlows.from(kafkaErrorChannel)
            .transform("payload.failedMessage")
            .handle(kafkaFailureHandler)
            .get();
}
Here is my test:
@SpyBean
MessageHandler kafkaFailureHandler;

@BeforeEach
public void setup() {
    MockitoAnnotations.openMocks(KafkaPublishFailureTest.class);
}

@Test
void testFailedKafkaPublish() {
    // Dummy message
    Map<String, String> map = new HashMap<>();
    map.put("key", "value");

    // Publish message
    Message<Map<String, String>> message = MessageBuilder.withPayload(map)
            .setHeader("X-UPSTREAM-TYPE", "alm")
            .setHeader("X-UPSTREAM-INSTANCE", "jira")
            .setHeader("X-MESSAGE-KEY", "key-1")
            .build();
    kafkaGateway.publish(message);

    // Failure handler called
    Mockito.verify(kafkaFailureHandler, Mockito.timeout(0).atLeastOnce())
            .handleMessage(ArgumentMatchers.any(Message.class));
}
We created a generic Kafka producer/consumer configuration to which downstream applications can attach whichever failure and success handlers best fit their needs. In this setup I cannot verify that the LoggingHandler is invoked at least once. The failureHandler runs on an ExecutorChannel backed by a ThreadPoolTaskExecutor:
@Bean
ExecutorChannel kafkaErrorChannel(Executor threadPoolExecutor) {
    return MessageChannels.executor("kafkaErrorChannel", threadPoolExecutor).get();
}
Failures are handled through a retry advice:
@Bean
RequestHandlerRetryAdvice retryAdvice(ExecutorChannel kafkaErrorChannel) {
    RequestHandlerRetryAdvice retryAdvice = new RequestHandlerRetryAdvice();
    retryAdvice.setRecoveryCallback(new ErrorMessageSendingRecoverer(kafkaErrorChannel));
    return retryAdvice;
}
I get this error when running the test:
java.lang.IllegalStateException: No bean found for definition [SpyDefinition@44dfdd58 name = '', typeToSpy = org.springframework.messaging.MessageHandler, reset = AFTER]
at org.springframework.util.Assert.state(Assert.java:97) ~[spring-core-5.3.4.jar:5.3.4]
at org.springframework.boot.test.mock.mockito.MockitoPostProcessor.inject(MockitoPostProcessor.java:351) ~[spring-boot-test-2.4.3.jar:2.4.3]
Here is what I tried, and it works:
@SpringBootApplication
public class Demo1Application {

    public static void main(String[] args) {
        SpringApplication.run(Demo1Application.class, args);
    }

    @Bean
    ExecutorChannel kafkaErrorChannel(TaskExecutor taskExecutor) {
        return new ExecutorChannel(taskExecutor);
    }

    @Bean
    public MessageHandler kafkaFailureHandler() {
        LoggingHandler handler = new LoggingHandler(LoggingHandler.Level.ERROR);
        handler.setShouldLogFullMessage(Boolean.TRUE);
        return handler;
    }

    @Bean
    IntegrationFlow kafkaFailureFlow(ExecutorChannel kafkaErrorChannel, MessageHandler kafkaFailureHandler) {
        return IntegrationFlows.from(kafkaErrorChannel)
                .transform("payload.failedMessage")
                .handle(kafkaFailureHandler)
                .get();
    }
}
@SpringBootTest
class Demo1ApplicationTests {

    @Autowired
    ExecutorChannel kafkaErrorChannel;

    @SpyBean
    MessageHandler kafkaFailureHandler;

    @Test
    void testSpyBean() throws InterruptedException {
        MessagingException payload = new MessageHandlingException(new GenericMessage<>("test"));
        this.kafkaErrorChannel.send(new ErrorMessage(payload));
        Thread.sleep(1000);
        Mockito.verify(this.kafkaFailureHandler).handleMessage(ArgumentMatchers.any(Message.class));
    }
}
Perhaps your problem is that you don't include the LogHandlerConfiguration in your @SpringBootTest configuration. That's why I asked for a simple project to play with: your code, with all those properties, is too custom to copy/paste into my environment...
Also note the Thread.sleep(1000);. Since your kafkaErrorChannel is an ExecutorChannel, message consumption happens on a different thread: it leaves your main test thread and fails because of the race condition. It is hard to guess the correct timing, so it is better to stub the mocked method against some thread barrier, such as a new CountDownLatch(1), and wait for it in the test.
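The latch-based barrier suggested above can be sketched without any Spring or Mockito machinery. In this illustrative stand-alone example (the Handler interface and all names are stand-ins, not part of the original project), a wrapper counts down a CountDownLatch once the real handler has run on the executor thread, and the test thread awaits it with a timeout instead of sleeping:

```java
import java.util.concurrent.CountDownLatch;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.TimeUnit;

public class LatchBarrierSketch {

    // Stand-in for org.springframework.messaging.MessageHandler
    interface Handler {
        void handleMessage(String message);
    }

    public static void main(String[] args) throws InterruptedException {
        CountDownLatch latch = new CountDownLatch(1);

        // The real handler whose invocation we want to observe
        Handler logging = msg -> System.out.println("handled: " + msg);

        // Wrapper that releases the latch after the handler runs,
        // mirroring what a Mockito doAnswer(...) stub on a spy would do
        Handler observed = msg -> {
            logging.handleMessage(msg);
            latch.countDown();
        };

        // Simulates the ExecutorChannel dispatching on another thread
        ExecutorService executor = Executors.newSingleThreadExecutor();
        executor.execute(() -> observed.handleMessage("failed-message"));

        // Deterministic wait instead of Thread.sleep(1000)
        boolean invoked = latch.await(5, TimeUnit.SECONDS);
        System.out.println("handler invoked: " + invoked);
        executor.shutdown();
    }
}
```

The await with a generous timeout returns as soon as the handler fires, so the test is both fast in the common case and robust against scheduling jitter.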
Off topic: you can also look into the Spring Integration testing framework: https://docs.spring.io/spring-integration/docs/current/textreference/html/testing.con
So, hold on: why didn't @SpyBean work? There were two problems with the @SpyBean setup. Here is what finally worked, using a named bean:
@Bean("kafkaFailureHandler")
public MessageHandler kafkaFailureHandler() {
    LoggingHandler handler = new LoggingHandler(LoggingHandler.Level.INFO);
    handler.setShouldLogFullMessage(Boolean.TRUE);
    return handler;
}
And then in the test, also reducing max.block.ms:
@DirtiesContext
@SpringBootTest(classes = {KafkaHandlerConfiguration.class, SwiftalkKafkaGateway.class})
@SpringIntegrationTest(noAutoStartup = {"kafkaFailureFlow"})
@TestPropertySource(properties = {
        "spring.main.banner-mode=off",
        "logging.level.root=INFO",
        "logging.level.org.springframework=INFO",
        "logging.level.org.springframework.integration=DEBUG",
        "spring.kafka.producer.properties.max.block.ms=50",
        "spring.kafka.producer.bootstrap-servers=localhost:9999",
        "spring.kafka.producer.key-serializer=org.apache.kafka.common.serialization.StringSerializer",
        "spring.kafka.producer.value-serializer=org.springframework.kafka.support.serializer.JsonSerializer",
})
public class KafkaPublishFailureTest {

    private static final Logger log = LogManager.getLogger(KafkaPublishFailureTest.class);

    @Autowired
    SwiftalkKafkaGateway kafkaGateway;

    @SpyBean(name = "kafkaFailureHandler")
    MessageHandler kafkaFailureHandler;

    @Test
    @SuppressWarnings("all")
    void testFailedKafkaPublish() throws InterruptedException {
        // Dummy message
        Map<String, String> map = new HashMap<>();
        map.put("key", "value");

        // Publish message
        Message<Map<String, String>> message = MessageBuilder.withPayload(map)
                .setHeader("X-UPSTREAM-TYPE", "alm")
                .setHeader("X-UPSTREAM-INSTANCE", "jira")
                .setHeader("X-MESSAGE-KEY", "key-1")
                .build();
        kafkaGateway.publish(message);

        verify(this.kafkaFailureHandler, timeout(500)).handleMessage(any(Message.class));
    }
}
Note the spring.kafka.producer.properties.max.block.ms=50 property and the verify(this.kafkaFailureHandler, timeout(500)).handleMessage(any(Message.class)); call.
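Mockito's timeout(500) makes verification a polling operation: it repeatedly re-checks whether the interaction has happened, succeeding as soon as it does, rather than waiting a fixed interval. A minimal stdlib sketch of that polling idea (illustrative only, not Mockito's actual implementation):

```java
import java.util.concurrent.atomic.AtomicInteger;

public class TimeoutVerifySketch {

    /** Polls until the counter reaches the expected count or the deadline passes. */
    static boolean verifyWithTimeout(AtomicInteger invocations, int expected, long timeoutMs)
            throws InterruptedException {
        long deadline = System.currentTimeMillis() + timeoutMs;
        while (System.currentTimeMillis() < deadline) {
            if (invocations.get() >= expected) {
                return true; // interaction observed before the deadline
            }
            Thread.sleep(10); // poll interval
        }
        return invocations.get() >= expected; // final check at the deadline
    }

    public static void main(String[] args) throws InterruptedException {
        AtomicInteger calls = new AtomicInteger();

        // Simulate the failure handler being invoked on another thread after ~100 ms
        new Thread(() -> {
            try {
                Thread.sleep(100);
            } catch (InterruptedException ignored) {
            }
            calls.incrementAndGet();
        }).start();

        System.out.println("verified: " + verifyWithTimeout(calls, 1, 500));
    }
}
```

This is why timeout(500) pairs well with max.block.ms=50: the producer gives up quickly, the error flow fires, and the verification returns almost immediately instead of consuming the full 500 ms.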