Spring Cloud Stream 2.0 ClassNotFoundException for custom headers
After upgrading to Spring Cloud Stream Elmhurst.RELEASE, my @StreamListener method throws an error when it consumes a message carrying a custom header that holds an HttpHeaders object.
Publisher method:
import java.util.Map;
import java.util.stream.Collectors;
import org.springframework.http.HttpHeaders;
import org.springframework.messaging.Message;
import org.springframework.messaging.support.MessageBuilder;
import org.springframework.messaging.support.MessageHeaderAccessor;

public static <T> Message<T> getMessage(T payload, HttpHeaders httpHeaders) {
    MessageHeaderAccessor accessor = new MessageHeaderAccessor();
    // Copy each HTTP header into the message headers, except the correlation id.
    accessor.copyHeaders(httpHeaders.entrySet().stream()
            .filter(e -> !e.getKey().equalsIgnoreCase("correlationid"))
            .collect(Collectors.toMap(Map.Entry::getKey, Map.Entry::getValue)));
    accessor.setHeader("httpHeaders", httpHeaders); // attach the whole object as one header
    return MessageBuilder.withPayload(payload).setHeaders(accessor).build();
}
Consumer method:
@StreamListener(INPUT_CHANNEL)
public void handleMessage(@Payload CustomObject request,
        @Header(name = "httpHeaders") HttpHeaders httpHeaders) {
    // handling
}
Error stack trace after consuming a message:
2018-08-28 00:50:08.854 ERROR [deposits,,,,] 28680 --- [container-0-C-1] o.s.k.support.DefaultKafkaHeaderMapper : Could not load class for header: httpHeaders
java.lang.ClassNotFoundException: org.springframework.http.HttpHeaders$$EnhancerBySpringCGLIB$$fd346b3a
at java.net.URLClassLoader.findClass(URLClassLoader.java:381)
at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:335)
at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
at org.springframework.util.ClassUtils.forName(ClassUtils.java:274)
at org.springframework.kafka.support.DefaultKafkaHeaderMapper.lambda$toHeaders$1(DefaultKafkaHeaderMapper.java:225)
at java.lang.Iterable.forEach(Iterable.java:75)
at org.springframework.kafka.support.DefaultKafkaHeaderMapper.toHeaders(DefaultKafkaHeaderMapper.java:216)
at org.springframework.kafka.support.converter.MessagingMessageConverter.toMessage(MessagingMessageConverter.java:106)
at org.springframework.kafka.listener.adapter.MessagingMessageListenerAdapter.toMessagingMessage(MessagingMessageListenerAdapter.java:229)
at org.springframework.integration.kafka.inbound.KafkaMessageDrivenChannelAdapter$IntegrationRecordMessageListener.onMessage(KafkaMessageDrivenChannelAdapter.java:375)
at org.springframework.integration.kafka.inbound.KafkaMessageDrivenChannelAdapter$IntegrationRecordMessageListener.onMessage(KafkaMessageDrivenChannelAdapter.java:364)
at org.springframework.kafka.listener.adapter.RetryingMessageListenerAdapter.lambda$onMessage$0(RetryingMessageListenerAdapter.java:120)
at org.springframework.retry.support.RetryTemplate.doExecute(RetryTemplate.java:287)
at org.springframework.retry.support.RetryTemplate.execute(RetryTemplate.java:211)
at org.springframework.kafka.listener.adapter.RetryingMessageListenerAdapter.onMessage(RetryingMessageListenerAdapter.java:114)
at org.springframework.kafka.listener.adapter.RetryingMessageListenerAdapter.onMessage(RetryingMessageListenerAdapter.java:40)
at org.springframework.kafka.listener.KafkaMessageListenerContainer$ListenerConsumer.doInvokeRecordListener(KafkaMessageListenerContainer.java:1071)
at org.springframework.kafka.listener.KafkaMessageListenerContainer$ListenerConsumer.doInvokeWithRecords(KafkaMessageListenerContainer.java:1051)
at org.springframework.kafka.listener.KafkaMessageListenerContainer$ListenerConsumer.invokeRecordListener(KafkaMessageListenerContainer.java:998)
at org.springframework.kafka.listener.KafkaMessageListenerContainer$ListenerConsumer.invokeListener(KafkaMessageListenerContainer.java:866)
at org.springframework.kafka.listener.KafkaMessageListenerContainer$ListenerConsumer.run(KafkaMessageListenerContainer.java:724)
at java.util.concurrent.Executors$RunnableAdapter.call$$$capture(Executors.java:511)
at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java)
at java.util.concurrent.FutureTask.run$$$capture(FutureTask.java:266)
at java.util.concurrent.FutureTask.run(FutureTask.java)
at java.lang.Thread.run(Thread.java:748)
This is most likely because web and actuator became optional dependencies as of 2.0 - https://docs.spring.io/spring-cloud-stream/docs/Fishtown.M1/reference/htmlsingle/#spring-cloud-stream-preface-actuator-web-dependencies
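Note that the class the header mapper fails to load is a CGLIB proxy (HttpHeaders$$EnhancerBySpringCGLIB$$fd346b3a), a generated class that exists only in the producer's JVM, so no consumer-side dependency can supply it. One possible workaround is to copy the proxied object into a plain, well-known concrete class before attaching it as a header, so the serialized class name is loadable everywhere. A minimal, framework-free sketch of the idea (the class and method names here are illustrative, not from the original post; an anonymous subclass stands in for the CGLIB proxy):

```java
import java.util.LinkedHashMap;
import java.util.List;
import java.util.Map;

public class PlainCopyDemo {

    // Stand-in for a CGLIB-proxied HttpHeaders: an anonymous subclass whose
    // runtime class name (PlainCopyDemo$1) would not exist on another JVM's classpath.
    static Map<String, List<String>> proxiedHeaders() {
        Map<String, List<String>> m = new LinkedHashMap<String, List<String>>() {};
        m.put("Content-Type", List.of("application/json"));
        return m;
    }

    // Defensive copy into a well-known concrete class before serializing as a header.
    static Map<String, List<String>> toPlain(Map<String, List<String>> source) {
        return new LinkedHashMap<>(source);
    }

    public static void main(String[] args) {
        Map<String, List<String>> proxy = proxiedHeaders();
        Map<String, List<String>> plain = toPlain(proxy);
        System.out.println(proxy.getClass().getName()); // subclass name, not loadable elsewhere
        System.out.println(plain.getClass().getName()); // java.util.LinkedHashMap
    }
}
```

Applied to the publisher above, the same pattern would mean building a fresh HttpHeaders (e.g. new HttpHeaders() plus putAll) from the injected one and setting that copy as the "httpHeaders" message header, instead of the proxy itself.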