
Spring Boot Kafka consumer issue deserializing AVRO GENERIC_RECORD using Glue Schema Registry

I have topics written by Kafka Connect in AVRO GENERIC_RECORD format using the Glue Schema Registry. Following the documentation, I am able to consume them with a plain Java program. However, I am having difficulty consuming them from a Spring Boot application.

My simple config class:

@EnableKafka
@Configuration
public class KafkaAvroConsumerConfig {

    @Value("${spring.kafka.bootstrap-servers}")
    private String brokers;
    @Value("${spring.kafka.consumer.group-id}")
    private String groupId;
    // region and registryName are used below but were missing from the original
    // snippet; assumed to be bound from properties such as these
    @Value("${aws.glue.region}")
    private String region;
    @Value("${aws.glue.registry-name}")
    private String registryName;

    // Creating a Listener
    @Bean
    public ConcurrentKafkaListenerContainerFactory<GenericRecord, GenericRecord> concurrentKafkaListenerContainerFactory() {
        ConcurrentKafkaListenerContainerFactory<GenericRecord, GenericRecord> factory = new ConcurrentKafkaListenerContainerFactory<>();
        factory.setConsumerFactory(consumerFactory());
        return factory;
    }

    @Bean
    public ConsumerFactory<GenericRecord, GenericRecord> consumerFactory() {
        return new DefaultKafkaConsumerFactory<>(consumerConfigs());
    }

    @Bean
    public Map<String, Object> consumerConfigs() {
        // Creating a Map of string-object pairs
        Map<String, Object> config = new HashMap<>();

        // Adding the Configuration
        config.put(ConsumerConfig.BOOTSTRAP_SERVERS_CONFIG, brokers);
        config.put(ConsumerConfig.GROUP_ID_CONFIG, groupId);

        config.put(ConsumerConfig.KEY_DESERIALIZER_CLASS_CONFIG, GlueSchemaRegistryKafkaDeserializer.class.getName());
        config.put(ConsumerConfig.VALUE_DESERIALIZER_CLASS_CONFIG, GlueSchemaRegistryKafkaDeserializer.class.getName());

        config.put(AWSSchemaRegistryConstants.AWS_REGION, region);
        config.put(AWSSchemaRegistryConstants.REGISTRY_NAME, registryName);

        config.put(AWSSchemaRegistryConstants.AVRO_RECORD_TYPE, AvroRecordType.GENERIC_RECORD.getName());
        config.put(AWSSchemaRegistryConstants.SCHEMA_NAMING_GENERATION_CLASS,
                MySchemaNamingStrategy.class.getName());

        return config;
    }
}

And the listener class:

@Component
public class KafkaAvroConsumer {

    @Autowired
    KafkaTemplate<GenericRecord, GenericRecord> kafkaTemplate;

    @KafkaListener(topics = "gsr1.HR.DEPARTMENTS")
    public void listenDepartment(ConsumerRecord<GenericRecord, GenericRecord> record) {

        //System.out.println("DEPARTMENTS key   schema = " + record.key().getSchema().toString());
        GenericRecord key = record.key();
        GenericRecord value = record.value();
        System.out.println("            record.key() = " + key);
        System.out.println("          record.value() = " + value);
        System.out.println("      Key  DEPARTMENT_ID = " + key.get("DEPARTMENT_ID"));
        System.out.println("         DEPARTMENT_NAME = " + (String) value.get("DEPARTMENT_NAME"));
    }

}

This gives me an error at "GenericRecord key = record.key();". It looks like the records did not get deserialized to GenericRecord; instead they are just raw bytes:

Caused by: java.lang.ClassCastException: class java.lang.String cannot be cast to class org.apache.avro.generic.GenericRecord (java.lang.String is in module java.base of loader 'bootstrap'; org.apache.avro.generic.GenericRecord is in unnamed module of loader 'app')

Looking at the Spring documentation, I saw that the DefaultKafkaConsumerFactory constructor also accepts key and value deserializer instances as parameters. So I tried the following, but it doesn't compile; GlueSchemaRegistryKafkaDeserializer doesn't take a type argument either:

    public ConsumerFactory<GenericRecord, GenericRecord> consumerFactory() {
        Deserializer<GenericRecord> avroDeser =  new GlueSchemaRegistryKafkaDeserializer();
        avroDeser.configure(consumerConfigs(), false);
        return new DefaultKafkaConsumerFactory<>(consumerConfigs(), avroDeser, avroDeser);
    }
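One way to get past the compile error (a sketch, not from the original post): GlueSchemaRegistryKafkaDeserializer implements the non-generic `Deserializer<Object>`, so it can be adapted to `Deserializer<GenericRecord>` with an unchecked cast through the raw wildcard type. This is safe at runtime only if the payloads really are Avro generic records:

```java
@SuppressWarnings("unchecked")
@Bean
public ConsumerFactory<GenericRecord, GenericRecord> consumerFactory() {
    // The Glue deserializer is not generic, so cast via Deserializer<?>
    GlueSchemaRegistryKafkaDeserializer avroDeser = new GlueSchemaRegistryKafkaDeserializer();
    avroDeser.configure(consumerConfigs(), false); // false = configure as a value deserializer
    return new DefaultKafkaConsumerFactory<>(consumerConfigs(),
            (Deserializer<GenericRecord>) (Deserializer<?>) avroDeser,
            (Deserializer<GenericRecord>) (Deserializer<?>) avroDeser);
}
```

Note that this sketch reuses one instance for both keys and values, configured with `isKey = false` as in the snippet above; if key and value behavior must differ, configure two separate instances.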

Any help on how to get this to work would be appreciated. I also posted the question on the GSR GitHub: https://github.com/awslabs/aws-glue-schema-registry/issues/241

Here is the POM:

<?xml version="1.0" encoding="UTF-8"?>
<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
    xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 https://maven.apache.org/xsd/maven-4.0.0.xsd">
    <modelVersion>4.0.0</modelVersion>
    <parent>
        <groupId>org.springframework.boot</groupId>
        <artifactId>spring-boot-starter-parent</artifactId>
        <version>3.0.1</version>
        <relativePath/> 
    </parent>
    <groupId>com.test</groupId>
    <artifactId>SpringBootKafkaAvro</artifactId>
    <version>0.0.1-SNAPSHOT</version>
    <name>SpringBootKafkaAvro</name>
    <description>Spring boot Kafka Avro using Glue Schema registry</description>
    <properties>
        <java.version>17</java.version>
    </properties>
    <dependencies>
        <dependency>
            <groupId>org.springframework.boot</groupId>
            <artifactId>spring-boot-starter</artifactId>
        </dependency>
        <dependency>
            <groupId>org.springframework.kafka</groupId>
            <artifactId>spring-kafka</artifactId>
        </dependency>
        <dependency>
            <groupId>software.amazon.glue</groupId>
            <artifactId>schema-registry-serde</artifactId>
            <version>1.1.14</version>
        </dependency>
        <dependency>
            <groupId>com.fasterxml.jackson.core</groupId>
            <artifactId>jackson-databind</artifactId>
        </dependency>
        <dependency>
            <groupId>com.fasterxml.jackson.datatype</groupId>
            <artifactId>jackson-datatype-jsr310</artifactId>
        </dependency>

        <dependency>
            <groupId>org.springframework.boot</groupId>
            <artifactId>spring-boot-starter-test</artifactId>
            <scope>test</scope>
        </dependency>
        <dependency>
            <groupId>org.springframework.kafka</groupId>
            <artifactId>spring-kafka-test</artifactId>
            <scope>test</scope>
        </dependency>
    </dependencies>

    <build>
        <plugins>
            <plugin>
                <groupId>org.springframework.boot</groupId>
                <artifactId>spring-boot-maven-plugin</artifactId>
            </plugin>
        </plugins>
    </build>

</project>

I figured out what the issue is. In the config class, Spring Boot expects the factory bean to be named kafkaListenerContainerFactory. I named it concurrentKafkaListenerContainerFactory, which caused the consumer and Glue configurations not to be loaded properly.

By default, a bean with the name kafkaListenerContainerFactory is expected.

https://docs.spring.io/spring-kafka/docs/current/reference/html/#kafka-listener-annotation
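Concretely, the fix is just renaming the factory method (or giving the @Bean an explicit name) so that it matches the name @EnableKafka looks up by default. A minimal sketch of the corrected bean from the config class above:

```java
// Renaming the bean to the default name "kafkaListenerContainerFactory"
// lets @KafkaListener pick it up; alternatively, keep the old method name
// and declare it as @Bean("kafkaListenerContainerFactory").
@Bean
public ConcurrentKafkaListenerContainerFactory<GenericRecord, GenericRecord> kafkaListenerContainerFactory() {
    ConcurrentKafkaListenerContainerFactory<GenericRecord, GenericRecord> factory =
            new ConcurrentKafkaListenerContainerFactory<>();
    factory.setConsumerFactory(consumerFactory());
    return factory;
}
```

Another option would be to keep the non-default bean name and point each listener at it explicitly, e.g. `@KafkaListener(topics = "gsr1.HR.DEPARTMENTS", containerFactory = "concurrentKafkaListenerContainerFactory")`.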
