Spring-Cloud-Stream-Kafka-Binder functional style ignores custom De/Serializer and/or useNativeEncoding?
We've just upgraded to the 3.0.0 release of Spring Cloud Stream and are encountering the following problem:
When using the functional style like this:
public class EventProcessor {

    private final PriceValidator priceValidator;

    @Bean
    public Function<Flux<EnrichedValidationRequest>, Flux<ValidationResult>> validate() {
        return enrichedValidationRequestFlux -> enrichedValidationRequestFlux
                .map(ProcessingContext::new)
                .flatMap(priceValidator::validateAndMap);
    }
}
The application.yaml looks like this:
spring.cloud.stream:
  default-binder: kafka
  kafka:
    binder:
      brokers: ${kafka.broker.prod}
      auto-create-topics: false
  function.definition: validate

# INPUT: enrichedValidationRequests
spring.cloud.stream.bindings.validate-in-0:
  destination: ${kafka.topic.${spring.application.name}.input.enrichedValidationRequests}
  group: ${spring.application.name}.${STAGE:NOT_SET}
  consumer:
    useNativeDecoding: true

spring.cloud.stream.kafka.bindings.validate-in-0:
  consumer:
    configuration:
      key.deserializer: org.apache.kafka.common.serialization.StringDeserializer
      value.deserializer: de.pricevalidator.deserializer.EnrichedValidationRequestDeserializer

# OUTPUT: validationResults
spring.cloud.stream.bindings.validate-out-0:
  destination: validationResultsTmp
  producer:
    useNativeEncoding: true

spring.cloud.stream.kafka.bindings.validate-out-0:
  producer:
    compression.type: lz4
    messageKeyExpression: payload.offerKey
    configuration:
      key.serializer: org.apache.kafka.common.serialization.StringSerializer
      value.serializer: de.pricevalidator.serializer.ValidationResultSerializer
It seems like serialization is done twice: when we intercept the messages produced to the Kafka topic, a consumer used to display them as plain JSON (strings), but now they are an unreadable byte[]. Also, the downstream consumers in production can no longer deserialize the messages. Strangely enough, deserialization of the input messages seems to work just fine, no matter what we put into the consumer properties (either at binder or at default Kafka level). We have a feeling this bug "is back", but we can't find the exact spot in the code: https://github.com/spring-cloud/spring-cloud-stream/issues/1536
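For context, with useNativeEncoding=true the byte[] produced by the configured value serializer should reach the topic unchanged. The sketch below is a minimal, self-contained illustration of that serialization step; the real ValidationResultSerializer would implement org.apache.kafka.common.serialization.Serializer<ValidationResult> (simplified here to a plain static method, and with hand-rolled JSON instead of Jackson, so it compiles without kafka-clients; the ValidationResult fields are assumptions):

```java
import java.nio.charset.StandardCharsets;

public class SerializerSketch {

    // Stand-in for the domain type from the question; the fields are assumptions.
    static class ValidationResult {
        final String offerKey;
        final boolean valid;

        ValidationResult(String offerKey, boolean valid) {
            this.offerKey = offerKey;
            this.valid = valid;
        }
    }

    // Turns the domain object into the exact bytes written to the topic.
    // With native encoding working correctly, these bytes are published as-is;
    // the reported bug effectively serialized the payload a second time,
    // yielding unreadable byte[] content for downstream consumers.
    static byte[] serialize(ValidationResult r) {
        String json = "{\"offerKey\":\"" + r.offerKey + "\",\"valid\":" + r.valid + "}";
        return json.getBytes(StandardCharsets.UTF_8);
    }

    public static void main(String[] args) {
        byte[] bytes = serialize(new ValidationResult("offer-1", true));
        // prints {"offerKey":"offer-1","valid":true}
        System.out.println(new String(bytes, StandardCharsets.UTF_8));
    }
}
```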
Our (ugly) workaround:
@Slf4j
@Configuration
public class KafkaMessageConverterConfiguration {

    @ConditionalOnProperty(value = "spring.cloud.stream.default-binder", havingValue = "kafka")
    @Bean
    public MessageConverter validationResultConverter(BinderTypeRegistry binder, ObjectMapper objectMapper) {
        return new AbstractMessageConverter(MimeType.valueOf("application/json")) {
            @Override
            protected boolean supports(final Class<?> clazz) {
                return ValidationResult.class.isAssignableFrom(clazz);
            }

            @Override
            protected Object convertToInternal(final Object payload, final MessageHeaders headers, final Object conversionHint) {
                return payload;
            }
        };
    }
}
Is there a "proper" way to set the custom serializer, or to get native encoding working as it did before?
So this was an issue reported right after 3.0.0.RELEASE: https://github.com/spring-cloud/spring-cloud-stream/commit/74aee8102898dbff96a570d2d2624571b259e141. It has been addressed and will be available in 3.0.1.RELEASE (Horsham.SR1) in a few days.
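Once 3.0.1.RELEASE is published, picking up the fix should just be a matter of bumping the release train version, e.g. with Maven (assuming you manage versions through the standard spring-cloud-stream BOM; adapt to your build setup):

```xml
<dependencyManagement>
  <dependencies>
    <dependency>
      <groupId>org.springframework.cloud</groupId>
      <artifactId>spring-cloud-stream-dependencies</artifactId>
      <version>Horsham.SR1</version>
      <type>pom</type>
      <scope>import</scope>
    </dependency>
  </dependencies>
</dependencyManagement>
```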