How to spy on an autowired bean in Spring tests
I have a simple logging-handler bean configuration which I inject into an IntegrationFlow:
@Configuration
class LogHandlerConfiguration {

    private LoggingHandler handler;

    @Bean
    public MessageHandler kafkaSuccessHandler() {
        return getLogger(LoggingHandler.Level.INFO);
    }

    @Bean(name = "kafkaFailureHandler")
    public MessageHandler kafkaFailureHandler() {
        return getLogger(LoggingHandler.Level.ERROR);
    }

    private LoggingHandler getLogger(LoggingHandler.Level level) {
        handler = new LoggingHandler(level);
        handler.setShouldLogFullMessage(Boolean.TRUE);
        return handler;
    }
}
The integration flow to test:
@Bean
IntegrationFlow kafkaFailureFlow(ExecutorChannel kafkaErrorChannel, MessageHandler kafkaFailureHandler) {
    return IntegrationFlows.from(kafkaErrorChannel)
            .transform("payload.failedMessage")
            .handle(kafkaFailureHandler)
            .get();
}
Here's my test:
@SpyBean
MessageHandler kafkaFailureHandler;

@BeforeEach
public void setup() {
    MockitoAnnotations.openMocks(KafkaPublishFailureTest.class);
}
@Test
void testFailedKafkaPublish() {
    // Dummy message
    Map<String, String> map = new HashMap<>();
    map.put("key", "value");
    // Publish message
    Message<Map<String, String>> message = MessageBuilder.withPayload(map)
            .setHeader("X-UPSTREAM-TYPE", "alm")
            .setHeader("X-UPSTREAM-INSTANCE", "jira")
            .setHeader("X-MESSAGE-KEY", "key-1")
            .build();
    kafkaGateway.publish(message);
    // Failure handler called
    Mockito.verify(kafkaFailureHandler, Mockito.timeout(0).atLeastOnce())
            .handleMessage(ArgumentMatchers.any(Message.class));
}
We've created a generic Kafka producer/consumer configuration to which downstream apps can attach the failure and success handlers best suited to their needs. I'm not able to verify that the LoggingHandler in this case is called at least once.
The failureHandler gets executed on an ExecutorChannel backed by a ThreadPoolTaskExecutor:
@Bean
ExecutorChannel kafkaErrorChannel(Executor threadPoolExecutor) {
    return MessageChannels.executor("kafkaErrorChannel", threadPoolExecutor).get();
}
Failures are handled via retry advice:
@Bean
RequestHandlerRetryAdvice retryAdvice(ExecutorChannel kafkaErrorChannel) {
    RequestHandlerRetryAdvice retryAdvice = new RequestHandlerRetryAdvice();
    retryAdvice.setRecoveryCallback(new ErrorMessageSendingRecoverer(kafkaErrorChannel));
    return retryAdvice;
}
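For illustration, the retry-then-recover behavior that RequestHandlerRetryAdvice plus ErrorMessageSendingRecoverer provide can be sketched in plain Java. This is not Spring API; the interface, names, and fixed attempt count are assumptions made for the sketch:

```java
public class RetrySketch {
    // Hypothetical stand-ins for the message handler and the recovery channel
    interface Handler { void handle(String msg) throws Exception; }

    static void handleWithRetry(Handler target, Handler recovery,
                                String msg, int maxAttempts) throws Exception {
        for (int attempt = 1; attempt <= maxAttempts; attempt++) {
            try {
                target.handle(msg);
                return; // success: no recovery needed
            } catch (Exception e) {
                System.out.println("attempt " + attempt + " failed: " + e.getMessage());
            }
        }
        // All retries exhausted: hand off to the recovery path,
        // as ErrorMessageSendingRecoverer does with the error channel
        recovery.handle(msg);
    }

    public static void main(String[] args) throws Exception {
        handleWithRetry(
                m -> { throw new Exception("broker unavailable"); },
                m -> System.out.println("recovered: " + m),
                "payload", 3);
    }
}
```

In the question's setup, the "recovery path" is the ErrorMessage sent to kafkaErrorChannel, which is what ultimately drives the kafkaFailureFlow.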
I get this error when I run the test:
java.lang.IllegalStateException: No bean found for definition [SpyDefinition@44dfdd58 name = '', typeToSpy = org.springframework.messaging.MessageHandler, reset = AFTER]
at org.springframework.util.Assert.state(Assert.java:97) ~[spring-core-5.3.4.jar:5.3.4]
at org.springframework.boot.test.mock.mockito.MockitoPostProcessor.inject(MockitoPostProcessor.java:351) ~[spring-boot-test-2.4.3.jar:2.4.3]
Here is what I've tried, and it works:
@SpringBootApplication
public class Demo1Application {

    public static void main(String[] args) {
        SpringApplication.run(Demo1Application.class, args);
    }

    @Bean
    ExecutorChannel kafkaErrorChannel(TaskExecutor taskExecutor) {
        return new ExecutorChannel(taskExecutor);
    }

    @Bean
    public MessageHandler kafkaFailureHandler() {
        LoggingHandler handler = new LoggingHandler(LoggingHandler.Level.ERROR);
        handler.setShouldLogFullMessage(Boolean.TRUE);
        return handler;
    }

    @Bean
    IntegrationFlow kafkaFailureFlow(ExecutorChannel kafkaErrorChannel, MessageHandler kafkaFailureHandler) {
        return IntegrationFlows.from(kafkaErrorChannel)
                .transform("payload.failedMessage")
                .handle(kafkaFailureHandler)
                .get();
    }
}
@SpringBootTest
class Demo1ApplicationTests {

    @Autowired
    ExecutorChannel kafkaErrorChannel;

    @SpyBean
    MessageHandler kafkaFailureHandler;

    @Test
    void testSpyBean() throws InterruptedException {
        MessagingException payload = new MessageHandlingException(new GenericMessage<>("test"));
        this.kafkaErrorChannel.send(new ErrorMessage(payload));
        Thread.sleep(1000);
        Mockito.verify(this.kafkaFailureHandler).handleMessage(ArgumentMatchers.any(Message.class));
    }
}
Perhaps your problem is that you don't include LogHandlerConfiguration in your @SpringBootTest configuration. That's why I asked for a simple project to play with. Your code with all those properties is too custom to just copy/paste into my environment...
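If that is the cause, a minimal fix is to import the configuration class into the test context explicitly, so the @SpyBean definition has a matching bean to wrap (a sketch using the class names from the question):

```java
// Explicitly list the configuration that defines kafkaFailureHandler;
// otherwise @SpyBean finds no MessageHandler bean to replace.
@SpringBootTest(classes = {LogHandlerConfiguration.class, KafkaHandlerConfiguration.class})
public class KafkaPublishFailureTest {

    @SpyBean(name = "kafkaFailureHandler")
    MessageHandler kafkaFailureHandler;
    // ...
}
```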
Also pay attention to that Thread.sleep(1000);. Since your kafkaErrorChannel is an ExecutorChannel, the message consumption happens on a different thread than your main testing thread, leading to a failure because of a race condition. It is hard to guess the proper timing, so it is better to stub the mocked method to release a thread barrier like new CountDownLatch(1) and wait for it in the test.
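The latch-based alternative to Thread.sleep can be sketched without Spring or Mockito, using only java.util.concurrent (the class and names here are illustrative, not from the question's project):

```java
import java.util.concurrent.CountDownLatch;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.TimeUnit;

public class LatchBarrierSketch {
    public static void main(String[] args) throws InterruptedException {
        // The latch plays the role of the stubbed spy method:
        // the test thread blocks until the handler thread has run.
        CountDownLatch handled = new CountDownLatch(1);
        ExecutorService channel = Executors.newSingleThreadExecutor();

        // Simulates the ExecutorChannel dispatching to the failure handler
        channel.execute(() -> {
            System.out.println("handleMessage called on " + Thread.currentThread().getName());
            handled.countDown();
        });

        // Deterministic wait instead of Thread.sleep(1000)
        boolean called = handled.await(5, TimeUnit.SECONDS);
        System.out.println("handler invoked: " + called);
        channel.shutdown();
    }
}
```

In a real test, the countDown() call would live inside a Mockito doAnswer(...) stub on the spy, so the latch releases exactly when handleMessage is invoked.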
Out of the subject, you can also investigate the Spring Integration Testing Framework: https://docs.spring.io/spring-integration/docs/current/reference/html/testing.html#test-context
So, I was hung up over "Why not @SpyBean?" There were two problems with @SpyBean. Here's what finally worked, using a named bean:
@Bean("kafkaFailureHandler")
public MessageHandler kafkaFailureHandler() {
    LoggingHandler handler = new LoggingHandler(LoggingHandler.Level.INFO);
    handler.setShouldLogFullMessage(Boolean.TRUE);
    return handler;
}
and then, in the tests, reducing the producer's max block time too:
@DirtiesContext
@SpringBootTest(classes = {KafkaHandlerConfiguration.class, SwiftalkKafkaGateway.class})
@SpringIntegrationTest(noAutoStartup = {"kafkaFailureFlow"})
@TestPropertySource(properties = {
        "spring.main.banner-mode=off",
        "logging.level.root=INFO",
        "logging.level.org.springframework=INFO",
        "logging.level.org.springframework.integration=DEBUG",
        "spring.kafka.producer.properties.max.block.ms=50",
        "spring.kafka.producer.bootstrap-servers=localhost:9999",
        "spring.kafka.producer.key-serializer=org.apache.kafka.common.serialization.StringSerializer",
        "spring.kafka.producer.value-serializer=org.springframework.kafka.support.serializer.JsonSerializer",
})
public class KafkaPublishFailureTest {

    private static final Logger log = LogManager.getLogger(KafkaPublishFailureTest.class);

    @Autowired
    SwiftalkKafkaGateway kafkaGateway;

    @SpyBean(name = "kafkaFailureHandler")
    MessageHandler kafkaFailureHandler;

    @Test
    @SuppressWarnings("all")
    void testFailedKafkaPublish() throws InterruptedException {
        // Dummy message
        Map<String, String> map = new HashMap<>();
        map.put("key", "value");
        // Publish message
        Message<Map<String, String>> message = MessageBuilder.withPayload(map)
                .setHeader("X-UPSTREAM-TYPE", "alm")
                .setHeader("X-UPSTREAM-INSTANCE", "jira")
                .setHeader("X-MESSAGE-KEY", "key-1")
                .build();
        kafkaGateway.publish(message);
        verify(this.kafkaFailureHandler, timeout(500)).handleMessage(any(Message.class));
    }
}
Notice the spring.kafka.producer.properties.max.block.ms=50 property, which makes the producer fail fast instead of blocking on the unreachable broker, and verify(this.kafkaFailureHandler, timeout(500)).handleMessage(any(Message.class));, which waits up to 500 ms for the handler to be invoked on the executor thread.
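The effect of Mockito's timeout(500) can be modeled in plain Java: the verification is re-checked until it passes or the deadline expires. This is a simplified sketch of the idea, not Mockito's actual implementation:

```java
import java.util.concurrent.atomic.AtomicInteger;

public class TimeoutVerifySketch {
    // Simplified model of verify(mock, timeout(ms)): re-check an
    // invocation count until it is satisfied or time runs out.
    static boolean verifyWithTimeout(AtomicInteger invocations, int expected,
                                     long timeoutMs) throws InterruptedException {
        long deadline = System.currentTimeMillis() + timeoutMs;
        while (System.currentTimeMillis() < deadline) {
            if (invocations.get() >= expected) return true;
            Thread.sleep(10); // short poll interval
        }
        return invocations.get() >= expected;
    }

    public static void main(String[] args) throws Exception {
        AtomicInteger calls = new AtomicInteger();
        // Handler fires on another thread after ~100 ms,
        // like the failure handler on the ExecutorChannel
        new Thread(() -> {
            try { Thread.sleep(100); } catch (InterruptedException ignored) {}
            calls.incrementAndGet();
        }).start();
        System.out.println("verified: " + verifyWithTimeout(calls, 1, 500));
    }
}
```

This is why timeout(500) succeeds where Mockito.timeout(0) in the original test failed: with a zero deadline, the check runs once, before the executor thread has had a chance to deliver the message.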