Spring Cloud Stream Kafka > consuming Avro messages from Confluent REST Proxy

I have the following scenario: Avro-encoded messages are published to a Kafka topic through Confluent's REST Proxy (with the schema registered in the Schema Registry), and my Spring Cloud Stream application is supposed to consume them.

My application looks like this:

@SpringBootApplication
@EnableBinding(Sink.class)
public class MyApplication {
  private static Logger log = LoggerFactory.getLogger(MyApplication.class);

  public static void main(String[] args) {
    SpringApplication.run(MyApplication.class, args);
  }

  @StreamListener(Sink.INPUT)
  public void myMessageSink(MyMessage message) {
    log.info("Received new message: {}", message);
  }
}

Here MyMessage is the class generated by Avro from the Avro schema.

My application.properties looks like this:

spring.cloud.stream.bindings.input.destination=myTopic
spring.cloud.stream.bindings.input.group=${spring.application.name}
spring.cloud.stream.bindings.input.contentType=application/*+avro

My problem is that every time a new message is received, the following exception is thrown:

org.springframework.messaging.MessagingException: Exception thrown while invoking MyApplication#myMessageSink[1 args]; nested exception is org.apache.avro.AvroRuntimeException: Malformed data. Length is negative: -27
    at org.springframework.cloud.stream.binding.StreamListenerAnnotationBeanPostProcessor$StreamListenerMessageHandler.handleRequestMessage(StreamListenerAnnotationBeanPostProcessor.java:316) ~[spring-cloud-stream-1.1.0.RELEASE.jar:1.1.0.RELEASE]
    at org.springframework.integration.handler.AbstractReplyProducingMessageHandler.handleMessageInternal(AbstractReplyProducingMessageHandler.java:109) ~[spring-integration-core-4.3.2.RELEASE.jar:4.3.2.RELEASE]
    at org.springframework.integration.handler.AbstractMessageHandler.handleMessage(AbstractMessageHandler.java:127) ~[spring-integration-core-4.3.2.RELEASE.jar:4.3.2.RELEASE]
    at org.springframework.integration.dispatcher.AbstractDispatcher.tryOptimizedDispatch(AbstractDispatcher.java:116) ~[spring-integration-core-4.3.2.RELEASE.jar:4.3.2.RELEASE]
    at org.springframework.integration.dispatcher.UnicastingDispatcher.doDispatch(UnicastingDispatcher.java:148) ~[spring-integration-core-4.3.2.RELEASE.jar:4.3.2.RELEASE]
    ...
Caused by: org.apache.avro.AvroRuntimeException: Malformed data. Length is negative: -27
    at org.apache.avro.io.BinaryDecoder.doReadBytes(BinaryDecoder.java:336) ~[avro-1.8.1.jar:1.8.1]
    at org.apache.avro.io.BinaryDecoder.readString(BinaryDecoder.java:263) ~[avro-1.8.1.jar:1.8.1]
    at org.apache.avro.io.ResolvingDecoder.readString(ResolvingDecoder.java:201) ~[avro-1.8.1.jar:1.8.1]
    at org.apache.avro.generic.GenericDatumReader.readString(GenericDatumReader.java:430) ~[avro-1.8.1.jar:1.8.1]
    at org.apache.avro.generic.GenericDatumReader.readString(GenericDatumReader.java:422) ~[avro-1.8.1.jar:1.8.1]
    at org.apache.avro.generic.GenericDatumReader.readMapKey(GenericDatumReader.java:335) ~[avro-1.8.1.jar:1.8.1]
    at org.apache.avro.generic.GenericDatumReader.readMap(GenericDatumReader.java:321) ~[avro-1.8.1.jar:1.8.1]
    at org.apache.avro.generic.GenericDatumReader.readWithoutConversion(GenericDatumReader.java:177) ~[avro-1.8.1.jar:1.8.1]
    at org.apache.avro.specific.SpecificDatumReader.readField(SpecificDatumReader.java:116) ~[avro-1.8.1.jar:1.8.1]
    at org.apache.avro.generic.GenericDatumReader.readRecord(GenericDatumReader.java:230) ~[avro-1.8.1.jar:1.8.1]
    at org.apache.avro.generic.GenericDatumReader.readWithoutConversion(GenericDatumReader.java:174) ~[avro-1.8.1.jar:1.8.1]
    at org.apache.avro.generic.GenericDatumReader.read(GenericDatumReader.java:152) ~[avro-1.8.1.jar:1.8.1]
    at org.apache.avro.generic.GenericDatumReader.read(GenericDatumReader.java:144) ~[avro-1.8.1.jar:1.8.1]
    at org.springframework.cloud.stream.schema.avro.AbstractAvroMessageConverter.convertFromInternal(AbstractAvroMessageConverter.java:91) ~[spring-cloud-stream-schema-1.1.0.RELEASE.jar:1.1.0.RELEASE]
    at org.springframework.messaging.converter.AbstractMessageConverter.fromMessage(AbstractMessageConverter.java:175) ~[spring-messaging-4.3.3.RELEASE.jar:4.3.3.RELEASE]
    at org.springframework.messaging.converter.CompositeMessageConverter.fromMessage(CompositeMessageConverter.java:67) ~[spring-messaging-4.3.3.RELEASE.jar:4.3.3.RELEASE]
    at org.springframework.messaging.handler.annotation.support.PayloadArgumentResolver.resolveArgument(PayloadArgumentResolver.java:117) ~[spring-messaging-4.3.3.RELEASE.jar:4.3.3.RELEASE]
    at org.springframework.messaging.handler.invocation.HandlerMethodArgumentResolverComposite.resolveArgument(HandlerMethodArgumentResolverComposite.java:112) ~[spring-messaging-4.3.3.RELEASE.jar:4.3.3.RELEASE]
    at org.springframework.messaging.handler.invocation.InvocableHandlerMethod.getMethodArgumentValues(InvocableHandlerMethod.java:138) ~[spring-messaging-4.3.3.RELEASE.jar:4.3.3.RELEASE]
    at org.springframework.messaging.handler.invocation.InvocableHandlerMethod.invoke(InvocableHandlerMethod.java:107) ~[spring-messaging-4.3.3.RELEASE.jar:4.3.3.RELEASE]
    at org.springframework.cloud.stream.binding.StreamListenerAnnotationBeanPostProcessor$StreamListenerMessageHandler.handleRequestMessage(StreamListenerAnnotationBeanPostProcessor.java:307) ~[spring-cloud-stream-1.1.0.RELEASE.jar:1.1.0.RELEASE]
    ... 35 common frames omitted

From what I understand, the problem is that the Confluent stack includes the ID of the message's schema as part of the message payload, and clients are expected to start reading the actual Avro data after the schema ID. It seems I need to configure the Kafka binding to use Confluent's KafkaAvroDeserializer, but I cannot figure out how to achieve this.
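For reference, Confluent's documentation describes the wire format produced by its Avro serializers as a zero "magic" byte, a four-byte big-endian schema ID, and then the raw Avro binary, so a consumer that bypasses KafkaAvroDeserializer has to skip that header itself. A minimal sketch of splitting such a frame (the record bytes below are fabricated for illustration):

```java
import java.nio.ByteBuffer;
import java.util.Arrays;

public class ConfluentWireFormat {

    // Extract the schema ID from a record framed in Confluent's wire format:
    // byte 0 is the magic byte (0), bytes 1-4 are the big-endian schema ID.
    public static int schemaId(byte[] record) {
        ByteBuffer buf = ByteBuffer.wrap(record);
        byte magic = buf.get();
        if (magic != 0) {
            throw new IllegalArgumentException("Unknown magic byte: " + magic);
        }
        return buf.getInt();
    }

    // Everything after the 5-byte header is the Avro-encoded body.
    public static byte[] avroBody(byte[] record) {
        return Arrays.copyOfRange(record, 5, record.length);
    }

    public static void main(String[] args) {
        // Fabricated record: magic byte 0, schema ID 42, two payload bytes.
        byte[] record = {0, 0, 0, 0, 42, 0x10, 0x20};
        System.out.println(schemaId(record));        // 42
        System.out.println(avroBody(record).length); // 2
    }
}
```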

(I can retrieve the messages perfectly fine using Confluent's Avro console consumer, so it doesn't seem to be an issue with the Avro encoding itself.)

I also tried playing around with the @EnableSchemaRegistryClient annotation and configuring a ConfluentSchemaRegistryClient bean, but it looks to me like this only controls where the schemas are stored and retrieved from, not the actual deserialization.

Is this even supposed to work somehow?

Would it work to set the per-binding property spring.cloud.stream.kafka.bindings.input.consumer.configuration.value.deserializer to Confluent's KafkaAvroDeserializer class name?
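A sketch of what that configuration might look like (untested; assumes io.confluent:kafka-avro-serializer is on the classpath and the Schema Registry is reachable at the URL below):

```properties
# Hand the raw record to Confluent's deserializer by configuring it
# directly on the binder's underlying Kafka consumer.
spring.cloud.stream.kafka.bindings.input.consumer.configuration.value.deserializer=io.confluent.kafka.serializers.KafkaAvroDeserializer
spring.cloud.stream.kafka.bindings.input.consumer.configuration.schema.registry.url=http://localhost:8081
spring.cloud.stream.kafka.bindings.input.consumer.configuration.specific.avro.reader=true
```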

Sort of answering my own question. What I did for now is implement a MessageConverter which simply removes the first four bytes of any message before passing it to the Avro decoder. The code is mostly taken from spring-cloud-stream's AbstractAvroMessageConverter:

public class ConfluentAvroSchemaMessageConverter extends AvroSchemaMessageConverter {

    public ConfluentAvroSchemaMessageConverter() {
        super(new MimeType("application", "avro+confluent"));
    }

    @Override
    protected Object convertFromInternal(Message<?> message, Class<?> targetClass, Object conversionHint) {
        Object result = null;
        try {
            byte[] payload = (byte[]) message.getPayload();

            // byte array to hold the message without the Confluent header (first 4 bytes)
            byte[] payloadWithoutConfluentHeader = new byte[payload.length - 4];
            ByteBuffer buf = ByteBuffer.wrap(payload);
            MimeType mimeType = getContentTypeResolver().resolve(message.getHeaders());
            if (mimeType == null) {
                if (conversionHint instanceof MimeType) {
                    mimeType = (MimeType) conversionHint;
                }
                else {
                    return null;
                }
            }

            // skip the first 4 bytes and copy the rest into the new byte array
            // see https://groups.google.com/forum/#!topic/confluent-platform/rI1WNPp8DJU
            buf.getInt();
            buf.get(payloadWithoutConfluentHeader);

            Schema writerSchema = resolveWriterSchemaForDeserialization(mimeType);
            Schema readerSchema = resolveReaderSchemaForDeserialization(targetClass);
            DatumReader<Object> reader = getDatumReader((Class<Object>) targetClass, readerSchema, writerSchema);
            Decoder decoder = DecoderFactory.get().binaryDecoder(payloadWithoutConfluentHeader, null);
            result = reader.read(null, decoder);
        }
        catch (IOException e) {
            throw new MessageConversionException(message, "Failed to read payload", e);
        }
        return result;
    }
}

I then set the content type for the incoming Kafka topic to application/avro+confluent via application.properties.
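In application.properties that is just the binding's contentType, matching the MimeType registered in the converter's constructor:

```properties
spring.cloud.stream.bindings.input.contentType=application/avro+confluent
```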

This at least lets me retrieve messages encoded with the Confluent stack, but of course it does not interact with the Schema Registry in any way.
