
Spring-Cloud-Stream-Kafka-Binder functional style ignores custom De/Serializer and/or useNativeEncoding?

We're just upgrading to the 3.0.0.RELEASE of Spring-Cloud-Stream and encounter the following problem:

When using the functional style like this:

@Configuration
@RequiredArgsConstructor // Lombok generates the constructor for the final field
public class EventProcessor {

    private final PriceValidator priceValidator;

    @Bean
    public Function<Flux<EnrichedValidationRequest>, Flux<ValidationResult>> validate() {
        return enrichedValidationRequestFlux -> enrichedValidationRequestFlux
                .map(ProcessingContext::new)
                .flatMap(priceValidator::validateAndMap);
    }
}

The application.yaml looks like this:

spring.cloud.stream:
  default-binder: kafka
  kafka:
    binder:
      brokers: ${kafka.broker.prod}
      auto-create-topics: false
  function.definition: validate

# INPUT: enrichedValidationRequests
spring.cloud.stream.bindings.validate-in-0:
  destination: ${kafka.topic.${spring.application.name}.input.enrichedValidationRequests}
  group: ${spring.application.name}.${STAGE:NOT_SET}
  consumer:
    useNativeDecoding: true


spring.cloud.stream.kafka.bindings.validate-in-0:
  consumer:
    configuration:
      key.deserializer: org.apache.kafka.common.serialization.StringDeserializer
      value.deserializer: de.pricevalidator.deserializer.EnrichedValidationRequestDeserializer


# OUTPUT: validationResults
spring.cloud.stream.bindings.validate-out-0:
  destination: validationResultsTmp
  producer:
    useNativeEncoding: true

spring.cloud.stream.kafka.bindings.validate-out-0:
  producer:
    compression.type: lz4
    messageKeyExpression: payload.offerKey
    configuration:
      key.serializer: org.apache.kafka.common.serialization.StringSerializer
      value.serializer: de.pricevalidator.serializer.ValidationResultSerializer

It seems like the serialization is done twice: when we intercept the messages produced to the Kafka topic, a consumer used to display them as JSON (strings), but now they are an unreadable byte[]. Also, the downstream consumers in production can't deserialize the messages anymore.

Strangely enough, the deserialization of the input messages seems to work just fine, no matter what we put into the consumer properties (either at binder or at default Kafka level).

We have a feeling this bug "is back", but we can't find the exact spot in the code: https://github.com/spring-cloud/spring-cloud-stream/issues/1536
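For reference, the same native en/decoding switches can also be applied to all bindings at once via the binding defaults. The property path below follows the Spring Cloud Stream reference documentation (in 3.0.0 it is presumably subject to the same problem as the per-binding form):

spring.cloud.stream.default:
  producer:
    useNativeEncoding: true
  consumer:
    useNativeDecoding: true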

Our (ugly) workaround:

@Slf4j
@Configuration
public class KafkaMessageConverterConfiguration {

    @ConditionalOnProperty(value = "spring.cloud.stream.default-binder", havingValue = "kafka")
    @Bean
    public MessageConverter validationResultConverter(BinderTypeRegistry binder, ObjectMapper objectMapper) {
        return new AbstractMessageConverter(MimeType.valueOf("application/json")) {
            @Override
            protected boolean supports(final Class<?> clazz) {
                return ValidationResult.class.isAssignableFrom(clazz);
            }

            @Override
            protected Object convertToInternal(final Object payload, final MessageHeaders headers, final Object conversionHint) {
                // Pass the payload through untouched so that the Kafka value serializer
                // configured on the binding receives the original object instead of an
                // already-JSON-converted byte[].
                return payload;
            }
        };
    }
}

Is there a "proper" way to set the custom serializer, or to get native encoding working as it did before?

So this was an issue reported right after 3.0.0.RELEASE: https://github.com/spring-cloud/spring-cloud-stream/commit/74aee8102898dbff96a570d2d2624571b259e141. It has been addressed and will be available in 3.0.1.RELEASE (Horsham.SR1) in a few days.
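Once 3.0.1.RELEASE is out, it should be picked up by importing the release-train BOM. The coordinates below assume the standard spring-cloud-stream-dependencies BOM (verify the exact version name when it is published):

<dependencyManagement>
  <dependencies>
    <dependency>
      <groupId>org.springframework.cloud</groupId>
      <artifactId>spring-cloud-stream-dependencies</artifactId>
      <version>Horsham.SR1</version>
      <type>pom</type>
      <scope>import</scope>
    </dependency>
  </dependencies>
</dependencyManagement>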

