Kafka with Spring-boot: Consumer to consume java.lang.Object (any/all object) via ConsumerRecord

So I am using spring-boot 2.1.6 and integrating a Kafka consumer that should consume any type of message published on a topic. For reference, I am following https://docs.spring.io/spring-boot/docs/2.1.6.RELEASE/reference/htmlsingle/

So I have this dependency in my pom:

    <dependency>
        <groupId>org.springframework.kafka</groupId>
        <artifactId>spring-kafka</artifactId>
    </dependency>

I am configuring this in application.yml:

spring:
  kafka:
    bootstrap-servers: localhost:9092
    consumer:
      group-id: foo
      auto-offset-reset: earliest
      key-deserializer: org.apache.kafka.common.serialization.StringDeserializer
      value-deserializer: org.springframework.kafka.support.serializer.JsonDeserializer
      properties:
        spring:
          json:
            value:
              default:
                type: java.lang.Object

And finally, here is my listener code:

 @KafkaListener(topics = "videoEnrichedEvents")
    public void consume(@Payload VideoEnrichedEventsvideoEnrichedEvents){
        LOGGER.debug("Consumed message :"+videoEnrichedEvents);
        System.out.println("Consumed Message :"videoEnrichedEvents);
    }

Since I have different topics and different consumers for them, I want the consumer configuration to be generic enough that I can read any object and then delegate it to a processing handler. In the error logs I can see:

Caused by: org.springframework.messaging.converter.MessageConversionException: Cannot handle message; nested exception is org.springframework.messaging.converter.MessageConversionException: Cannot convert from [java.util.LinkedHashMap] to [com.calamp.connect.vs.model.VideoEnrichedEvents] for GenericMessage [payload={anyotherjson={groups=null, id=0, driverName=from Kusum's console, deviceIdType=null, assetId=null, operatorId=null, avlEventTime=null, videoLink=null, tripId=null, avlEventUuid=null, deviceId=null, appMessageUuid=null, parentAccountList=null, appmsgEventTime=null, enrichedMessage=null, accountId=null}}, headers={kafka_offset=9, kafka_consumer=org.apache.kafka.clients.consumer.KafkaConsumer@18213932, kafka_timestampType=CREATE_TIME, kafka_receivedMessageKey=null, kafka_receivedPartitionId=0, kafka_receivedTopic=videoEnrichedEvents, kafka_receivedTimestamp=1590218109430}], failedMessage=GenericMessage [payload={anyotherjson={groups=null, id=0, driverName=from Kusum's console, deviceIdType=null, assetId=null, operatorId=null, avlEventTime=null, videoLink=null, tripId=null, avlEventUuid=null, deviceId=null, appMessageUuid=null, parentAccountList=null, appmsgEventTime=null, enrichedMessage=null, accountId=null}}, headers={kafka_offset=9, kafka_consumer=org.apache.kafka.clients.consumer.KafkaConsumer@18213932, kafka_timestampType=CREATE_TIME, kafka_receivedMessageKey=null, kafka_receivedPartitionId=0, kafka_receivedTopic=videoEnrichedEvents, kafka_receivedTimestamp=1590218109430}]

After a little googling I found that ConsumerRecord was being used everywhere instead of LinkedHashMap.
My new code now looks like:

 @KafkaListener(topics = "videoEnrichedEvents")
    public void consume(@Payload ConsumerRecord consumerRecord){
        LOGGER.debug("Consumed message!!!Full :"+consumerRecord);
        System.out.println("Consumed Message!!! Actual object :"+((LinkedHashMap)consumerRecord.value()));
    }

It technically handles any object sent to me, so it solves my purpose. But my question is: why ConsumerRecord and not LinkedHashMap? Is there any specific reason?

The simplest way to deal with this is to use a ByteArrayDeserializer together with a ByteArrayJsonMessageConverter bean (simply add it to the application context and Boot will wire it in).
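
A minimal sketch of what that could look like, assuming the value deserializer in application.yml is switched to org.apache.kafka.common.serialization.ByteArrayDeserializer (the configuration class name below is made up for illustration):

    import org.springframework.context.annotation.Bean;
    import org.springframework.context.annotation.Configuration;
    import org.springframework.kafka.support.converter.ByteArrayJsonMessageConverter;

    @Configuration
    public class KafkaConverterConfig {

        // Boot detects a single RecordMessageConverter bean and wires it into the
        // auto-configured listener container factory, so no further setup is needed.
        @Bean
        public ByteArrayJsonMessageConverter jsonMessageConverter() {
            return new ByteArrayJsonMessageConverter();
        }
    }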

This way, the conversion from JSON is deferred until right before we call the method, so we know what the target type is.
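
With that converter in place, a listener can declare the concrete payload type and the conversion targets that type at invocation time. A hedged sketch (VideoEnrichedEvents and the topic name come from the question, the rest is assumed):

    import org.slf4j.Logger;
    import org.slf4j.LoggerFactory;
    import org.springframework.kafka.annotation.KafkaListener;
    import org.springframework.stereotype.Component;

    @Component
    public class VideoEnrichedEventsListener {

        private static final Logger LOGGER = LoggerFactory.getLogger(VideoEnrichedEventsListener.class);

        // Conversion from the raw byte[] to VideoEnrichedEvents happens here,
        // because the converter now knows the target type from the method signature.
        @KafkaListener(topics = "videoEnrichedEvents")
        public void consume(VideoEnrichedEvents videoEnrichedEvents) {
            LOGGER.debug("Consumed message :{}", videoEnrichedEvents);
        }
    }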

See https://docs.spring.io/spring-kafka/docs/2.5.0.RELEASE/reference/html/#messaging-message-conversion

NOTE: This method can't be used with a class-level listener and @KafkaHandler methods because, in that case, we use the type to select which method to call.
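
For context, a class-level listener looks roughly like the hypothetical sketch below; the already-deserialized payload type is what routes a record to a handler method, which is why deferring conversion to a byte array does not work here:

    import org.springframework.kafka.annotation.KafkaHandler;
    import org.springframework.kafka.annotation.KafkaListener;
    import org.springframework.stereotype.Component;

    @Component
    @KafkaListener(topics = "videoEnrichedEvents")
    public class MultiTypeListener {

        // The payload type selects which handler is invoked, so the type must
        // already be known when the record is deserialized.
        @KafkaHandler
        public void handleVideo(VideoEnrichedEvents event) {
            System.out.println("video event: " + event);
        }

        @KafkaHandler(isDefault = true)
        public void handleOther(Object other) {
            System.out.println("something else: " + other);
        }
    }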

The method signature is flexible and can take a ConsumerRecord or the deserialized object contained in the record. The latter relies on deserialization converting the incoming message to the type in the method signature. If Jackson cannot determine what type the incoming JSON message is, it will deserialize it to a map, since a JSON object is effectively just a map; so it is providing exactly what you asked for - an Object (where a LinkedHashMap is the only Object it is able to create with the available information).
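
To see why a map appears, here is a small self-contained sketch of what Jackson does when asked to bind a JSON object to Object (the JSON literal is made up for illustration):

    import com.fasterxml.jackson.databind.ObjectMapper;

    public class JacksonObjectBindingDemo {

        public static void main(String[] args) throws Exception {
            ObjectMapper mapper = new ObjectMapper();
            String json = "{\"id\":0,\"driverName\":\"from console\"}";

            // With Object as the target type, Jackson binds a JSON object
            // to a java.util.LinkedHashMap.
            Object value = mapper.readValue(json, Object.class);
            System.out.println(value.getClass()); // class java.util.LinkedHashMap
            System.out.println(value);            // {id=0, driverName=from console}
        }
    }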

Hence the behaviour you are seeing is because the deserializer is not able to deserialize the message to a specific class, and so the method signature can accept either a Map or a ConsumerRecord - where a ConsumerRecord is a valid argument for any message, regardless of deserialization.

If you want to handle different types in this way, it is best to use a custom Deserializer that can look at some aspect of the message and create an instance of the correct class; specify that Deserializer in your yaml instead of the JsonDeserializer.
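
A hedged sketch of such a Deserializer, assuming the producer puts a discriminator field (here called "eventType") into each JSON message; the field name and the routing logic are assumptions, not something from the question:

    import java.util.LinkedHashMap;

    import org.apache.kafka.common.serialization.Deserializer;

    import com.fasterxml.jackson.databind.JsonNode;
    import com.fasterxml.jackson.databind.ObjectMapper;

    public class TypeRoutingDeserializer implements Deserializer<Object> {

        private final ObjectMapper mapper = new ObjectMapper();

        @Override
        public Object deserialize(String topic, byte[] data) {
            if (data == null) {
                return null; // tombstone record
            }
            try {
                JsonNode root = mapper.readTree(data);
                // "eventType" is a hypothetical field the producer would have to set.
                String eventType = root.path("eventType").asText("");
                if ("video".equals(eventType)) {
                    return mapper.treeToValue(root, VideoEnrichedEvents.class);
                }
                // Fall back to a generic map for anything we do not recognise.
                return mapper.convertValue(root, LinkedHashMap.class);
            }
            catch (Exception e) {
                throw new IllegalStateException("Failed to deserialize record from topic " + topic, e);
            }
        }
    }

You would then point spring.kafka.consumer.value-deserializer at this class instead of the JsonDeserializer.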
