
Using Schema Registry from Confluent with Avro and Kafka in Spring Boot Applications

First of all, I must say I'm not familiar with Confluent.

I was following this tutorial: https://www.confluent.io/blog/schema-registry-avro-in-spring-boot-application-tutorial/ and I got stuck.

I couldn't create the consumer for Kafka because I received an error: io.confluent.common.config.ConfigException: Missing required configuration "schema.registry.url" which has no default value.

I couldn't find this schema property in the yml config.

Confluent is running locally:

$: confluent local start
zookeeper is already running. Try restarting if needed
kafka is already running. Try restarting if needed
schema-registry is already running. Try restarting if needed
Starting kafka-rest
kafka-rest is [UP]
Starting connect
connect is [UP]
Starting ksql-server
ksql-server is [UP]
Starting control-center
control-center is [UP]

After I set up the users topic in Spring, from Control Center I see a different schema:

{
  "connect.name": "ksql.users",
  "fields": [
    {
      "name": "registertime",
      "type": "long"
    },
    {
      "name": "userid",
      "type": "string"
    },
    {
      "name": "regionid",
      "type": "string"
    },
    {
      "name": "gender",
      "type": "string"
    }
  ],
  "name": "users",
  "namespace": "ksql",
  "type": "record"
}

These are my files:

user.avro

{"namespace": "com.example.demo.model",
 "type": "record",
 "name": "User",
 "fields": [
     {"name": "name", "type": "string"},
     {"name": "age",  "type": "int"}
 ]
}

pom.xml

<?xml version="1.0" encoding="UTF-8"?>
<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
    xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 https://maven.apache.org/xsd/maven-4.0.0.xsd">
    <modelVersion>4.0.0</modelVersion>
    <parent>
        <groupId>org.springframework.boot</groupId>
        <artifactId>spring-boot-starter-parent</artifactId>
        <version>2.3.1.RELEASE</version>
        <relativePath/> <!-- lookup parent from repository -->
    </parent>
    <groupId>com.example</groupId>
    <artifactId>demo</artifactId>
    <version>0.0.1-SNAPSHOT</version>
    <name>demo</name>
    <description>Demo project for Spring Boot</description>

    <properties>
        <java.version>11</java.version>
    </properties>

    <dependencies>
        <dependency>
            <groupId>org.springframework.boot</groupId>
            <artifactId>spring-boot-starter-web</artifactId>
        </dependency>

        <dependency>
            <groupId>org.projectlombok</groupId>
            <artifactId>lombok</artifactId>
            <optional>true</optional>
        </dependency>
        <dependency>
            <groupId>org.springframework.boot</groupId>
            <artifactId>spring-boot-starter-test</artifactId>
            <scope>test</scope>
            <exclusions>
                <exclusion>
                    <groupId>org.junit.vintage</groupId>
                    <artifactId>junit-vintage-engine</artifactId>
                </exclusion>
            </exclusions>
        </dependency>

<!--        <dependency>-->
<!--            <groupId>org.apache.avro</groupId>-->
<!--            <artifactId>avro</artifactId>-->
<!--            <version>1.10.0</version>-->
<!--        </dependency>-->

        <!-- other dependencies -->
        <dependency>
            <groupId>org.springframework.kafka</groupId>
            <artifactId>spring-kafka</artifactId>
        </dependency>
        <dependency>
            <groupId>io.confluent</groupId>
            <artifactId>kafka-schema-registry-client</artifactId>
            <version>5.3.0</version>
        </dependency>
        <dependency>
            <groupId>org.apache.avro</groupId>
            <artifactId>avro</artifactId>
            <version>1.10.0</version>
        </dependency>
        <dependency>
            <groupId>io.confluent</groupId>
            <artifactId>kafka-avro-serializer</artifactId>
            <version>5.2.1</version>
        </dependency>
        <dependency>
            <groupId>io.confluent</groupId>
            <artifactId>kafka-streams-avro-serde</artifactId>
            <version>5.3.0</version>
            <exclusions>
                <exclusion>
                    <groupId>org.slf4j</groupId>
                    <artifactId>slf4j-log4j12</artifactId>
                </exclusion>
            </exclusions>
        </dependency>

    </dependencies>

    <repositories>
        <!-- other maven repositories the project -->
        <repository>
            <id>confluent</id>
            <url>https://packages.confluent.io/maven/</url>
        </repository>
    </repositories>

    <build>
        <plugins>

            <plugin>
                <groupId>org.apache.avro</groupId>
                <artifactId>avro-maven-plugin</artifactId>
                <version>1.10.0</version>
                <executions>
                    <execution>
                        <phase>generate-sources</phase>
                        <goals>
                            <goal>schema</goal>
                        </goals>
                        <configuration>
                            <sourceDirectory>${project.basedir}/src/main/avro/</sourceDirectory>
                            <outputDirectory>${project.basedir}/src/main/java/</outputDirectory>
                        </configuration>
                    </execution>
                </executions>
            </plugin>

            <plugin>
                <groupId>org.springframework.boot</groupId>
                <artifactId>spring-boot-maven-plugin</artifactId>
            </plugin>
        </plugins>
    </build>

</project>

DemoApplication.java

package com.example.demo;

import org.apache.kafka.clients.admin.NewTopic;
import org.springframework.beans.factory.annotation.Value;
import org.springframework.boot.SpringApplication;
import org.springframework.boot.autoconfigure.SpringBootApplication;
import org.springframework.context.annotation.Bean;

@SpringBootApplication
public class DemoApplication {

    @Value("${topic.name}")
    private String topicName;

    @Value("${topic.partitions-num}")
    private Integer partitions;

    @Value("${topic.replication-factor}")
    private short replicationFactor;


    @Bean
    NewTopic moviesTopic() {
        return new NewTopic(topicName, partitions, replicationFactor);
    }

    public static void main(String[] args) {
        SpringApplication.run(DemoApplication.class, args);
    }

}

Consumer.java

package com.example.demo.kafka;

import com.example.demo.model.User;
import lombok.extern.apachecommons.CommonsLog;
import org.apache.kafka.clients.consumer.ConsumerRecord;
import org.springframework.beans.factory.annotation.Value;
import org.springframework.kafka.annotation.KafkaListener;
import org.springframework.stereotype.Service;

@Service
@CommonsLog(topic = "Consumer Logger")
public class Consumer {


    @Value("${topic.name}")
    private String topicName;

    @KafkaListener(topics = "users", groupId = "group_id")
    public void consume(ConsumerRecord<String, User> record) {
        log.info(String.format("Consumed message -> %s", record.value()));
    }
}

KafkaController.java

package com.example.demo.kafka;

import com.example.demo.model.User;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.web.bind.annotation.PostMapping;
import org.springframework.web.bind.annotation.RequestMapping;
import org.springframework.web.bind.annotation.RequestParam;
import org.springframework.web.bind.annotation.RestController;

@RestController
@RequestMapping(value = "/user")
public class KafkaController {

    private final Producer producer;

    @Autowired
    KafkaController(Producer producer) {
        this.producer = producer;
    }

    @PostMapping(value = "/publish")
    public void sendMessageToKafkaTopic(@RequestParam("name") String name, @RequestParam("age") Integer age) {
        this.producer.sendMessage(new User(name, age));
    }
}

Producer.java

package com.example.demo.kafka;

import com.example.demo.model.User;
import lombok.extern.apachecommons.CommonsLog;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.beans.factory.annotation.Value;
import org.springframework.kafka.core.KafkaTemplate;
import org.springframework.stereotype.Service;

@Service
@CommonsLog(topic = "Producer Logger")
public class Producer {

    @Value("${topic.name}")
    private String TOPIC;

    private final KafkaTemplate<String, User> kafkaTemplate;

    @Autowired
    public Producer(KafkaTemplate<String, User> kafkaTemplate) {
        this.kafkaTemplate = kafkaTemplate;
    }

    void sendMessage(User user) {
        this.kafkaTemplate.send(this.TOPIC, user.getName().toString(), user);
        log.info(String.format("Produced user -> %s", user));
    }
}

application.yml

server:
  port: 8080
spring:
  kafka:
    consumer:
      bootstrap-servers: localhost:9092
      group-id: group_id
      auto-offset-reset: earliest
      key-deserializer: org.apache.kafka.common.serialization.StringDeserializer
      value-deserializer: io.confluent.kafka.serializers.KafkaAvroDeserializer
    producer:
      bootstrap-servers: localhost:9092
      key-serializer: org.apache.kafka.common.serialization.StringSerializer
      value-serializer: io.confluent.kafka.serializers.KafkaAvroSerializer
    bootstrap-servers: localhost:9092

topic:
  name: users
  partitions-num: 1
  replication-factor: 1


Step 1: Add this property, even if IntelliJ doesn't recognise it:

spring:
  kafka:
    properties:
      # default url for schema registry is localhost:8081
      schema.registry.url: http://localhost:8081
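
The same property can also be supplied programmatically. Below is a minimal sketch (the KafkaConsumerConfig class name is illustrative, not from the tutorial) of a consumer factory that passes schema.registry.url to the KafkaAvroDeserializer; it assumes the Confluent serializer dependencies already declared in the pom:

package com.example.demo.kafka;

import java.util.HashMap;
import java.util.Map;

import com.example.demo.model.User;
import io.confluent.kafka.serializers.KafkaAvroDeserializer;
import io.confluent.kafka.serializers.KafkaAvroDeserializerConfig;
import org.apache.kafka.clients.consumer.ConsumerConfig;
import org.apache.kafka.common.serialization.StringDeserializer;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.kafka.config.ConcurrentKafkaListenerContainerFactory;
import org.springframework.kafka.core.ConsumerFactory;
import org.springframework.kafka.core.DefaultKafkaConsumerFactory;

@Configuration
public class KafkaConsumerConfig {

    @Bean
    public ConsumerFactory<String, User> consumerFactory() {
        Map<String, Object> props = new HashMap<>();
        props.put(ConsumerConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");
        props.put(ConsumerConfig.GROUP_ID_CONFIG, "group_id");
        props.put(ConsumerConfig.KEY_DESERIALIZER_CLASS_CONFIG, StringDeserializer.class);
        props.put(ConsumerConfig.VALUE_DESERIALIZER_CLASS_CONFIG, KafkaAvroDeserializer.class);
        // The property the ConfigException complains about: where the deserializer looks up schemas.
        props.put(KafkaAvroDeserializerConfig.SCHEMA_REGISTRY_URL_CONFIG, "http://localhost:8081");
        // Deserialize into the generated User class instead of a GenericRecord.
        props.put(KafkaAvroDeserializerConfig.SPECIFIC_AVRO_READER_CONFIG, true);
        return new DefaultKafkaConsumerFactory<>(props);
    }

    @Bean
    public ConcurrentKafkaListenerContainerFactory<String, User> kafkaListenerContainerFactory() {
        ConcurrentKafkaListenerContainerFactory<String, User> factory =
                new ConcurrentKafkaListenerContainerFactory<>();
        factory.setConsumerFactory(consumerFactory());
        return factory;
    }
}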

Step 2: Make sure the schema registered for the current topic is the same as the Avro file in the project. (Select NONE compatibility if you have problems saving it.)
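
To double-check Step 2, the following sketch compares what the registry holds with what the project generates from user.avro. The class name and subject are assumptions: it uses the kafka-schema-registry-client API from the versions in the pom and the default subject naming of <topic>-value for value schemas:

package com.example.demo.kafka;

import org.apache.avro.Schema;

import io.confluent.kafka.schemaregistry.client.CachedSchemaRegistryClient;
import io.confluent.kafka.schemaregistry.client.SchemaMetadata;
import io.confluent.kafka.schemaregistry.client.SchemaRegistryClient;

public class SchemaCheck {

    public static void main(String[] args) throws Exception {
        SchemaRegistryClient client =
                new CachedSchemaRegistryClient("http://localhost:8081", 10);

        // Value schemas are registered under "<topic>-value" by default.
        String subject = "users-value";

        // The schema the registry currently holds for the topic.
        SchemaMetadata registered = client.getLatestSchemaMetadata(subject);
        System.out.println("Registered: " + registered.getSchema());

        // The schema generated by the avro-maven-plugin from user.avro.
        Schema local = com.example.demo.model.User.getClassSchema();
        System.out.println("Local:      " + local);

        // True if the local schema passes the subject's compatibility check.
        System.out.println("Compatible: " + client.testCompatibility(subject, local));
    }
}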
