
Spring-cloud kafka stream schema registry

I am trying to use functional programming (and Spring Cloud Stream) to transform an AVRO message from an input topic and publish a new message on an output topic. Here is my transform function:

@Bean
public Function<KStream<String, Data>, KStream<String, Double>> evenNumberSquareProcessor() {
    return kStream -> kStream.transform(() -> new CustomProcessor(STORE_NAME), STORE_NAME);
}

CustomProcessor is a class that implements the "Transformer" interface.
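For context, here is a minimal sketch of what such a Transformer might look like. The real CustomProcessor is not shown in the question, so the store name, the Data accessor (getNumber()), and the squaring logic are all assumptions inferred from the bean name:

```java
import org.apache.kafka.streams.KeyValue;
import org.apache.kafka.streams.kstream.Transformer;
import org.apache.kafka.streams.processor.ProcessorContext;
import org.apache.kafka.streams.state.KeyValueStore;

// Hypothetical sketch of CustomProcessor, not the asker's actual class.
public class CustomProcessor implements Transformer<String, Data, KeyValue<String, Double>> {

    private final String storeName;
    private KeyValueStore<String, Double> store;

    public CustomProcessor(String storeName) {
        this.storeName = storeName;
    }

    @Override
    @SuppressWarnings("unchecked")
    public void init(ProcessorContext context) {
        // Retrieve the state store registered under storeName.
        this.store = (KeyValueStore<String, Double>) context.getStateStore(storeName);
    }

    @Override
    public KeyValue<String, Double> transform(String key, Data value) {
        // Assumed logic: square the number carried by the Avro record.
        double squared = value.getNumber() * value.getNumber();
        store.put(key, squared);
        return KeyValue.pair(key, squared);
    }

    @Override
    public void close() {
        // Nothing to clean up; the store is managed by Kafka Streams.
    }
}
```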

I have tried the transformation with non-AVRO input and it works fine.

My difficulty is how to declare the schema registry, either in the application.yaml file or in the Spring application itself.

I have tried many different configurations (it seems difficult to find the right documentation), and each time the application cannot find the settings for schema.registry.url. I get the following error:

Error creating bean with name 'kafkaStreamsFunctionProcessorInvoker': Invocation of init method failed; nested exception is java.lang.IllegalStateException: org.apache.kafka.common.config.ConfigException: Missing required configuration "schema.registry.url" which has no default value.

Here is my application.yml file:

spring:
  cloud:
    stream:
      function:
        definition: evenNumberSquareProcessor
      bindings:
        evenNumberSquareProcessor-in-0:
          destination: input
          content-type: application/*+avro
          group: group-1
        evenNumberSquareProcessor-out-0:
          destination: output
      kafka:
        binder:
          brokers: my-cluster-kafka-bootstrap.kafka:9092
          consumer-properties:
            value.deserializer: io.confluent.kafka.serializers.KafkaAvroDeserializer
            schema.registry.url: http://localhost:8081

I have also tried this configuration:

spring:
  cloud:
    stream:
      kafka:
        streams:
          binder:
            brokers: my-cluster-kafka-bootstrap.kafka:9092
            configuration:
              schema.registry.url: http://localhost:8081
              default.value.serde: io.confluent.kafka.streams.serdes.avro.SpecificAvroSerde
          bindings:
            evenNumberSquareProcessor-in-0:
              consumer:
                destination: input
                valueSerde: io.confluent.kafka.streams.serdes.avro.SpecificAvroSerde
            evenNumberSquareProcessor-out-0:
              destination: output

My Spring Boot application is declared this way, with the schema registry client activated:

@EnableSchemaRegistryClient
@SpringBootApplication
public class TransformApplication {
    public static void main(String[] args) {
        SpringApplication.run(TransformApplication.class, args);
    }
}

Thanks for any help you could bring me.

Regards, CG

Configure the schema registry under configuration; it will then be available to all bindings. By the way, the Avro Serde goes under bindings for the specific channel, unless you want to use the default property default.value.serde. Your Serde might be the wrong one, too.

spring:
  cloud:
    stream:
      kafka:
        streams:
          binder:
            brokers: localhost:9092
            configuration:
              schema.registry.url: http://localhost:8081
              default.value.serde: io.confluent.kafka.streams.serdes.avro.SpecificAvroSerde
          bindings:
            process-in-0:
              consumer:
                valueSerde: io.confluent.kafka.streams.serdes.avro.SpecificAvroSerde

Don't use @EnableSchemaRegistryClient. Enable the schema registry on the Avro Serde instead. In this example, I am using the Data bean from your definition. Try to follow the example here.

import java.util.AbstractMap;
import java.util.Map;
import java.util.stream.Collectors;
import java.util.stream.Stream;

import org.apache.kafka.common.serialization.Serde;
import org.apache.kafka.common.serialization.Serdes;
import org.springframework.stereotype.Service;

import io.confluent.kafka.streams.serdes.avro.SpecificAvroSerde;

import static io.confluent.kafka.serializers.AbstractKafkaAvroSerDeConfig.SCHEMA_REGISTRY_URL_CONFIG;

@Service
public class CustomSerdes extends Serdes {

    // Point the Avro Serde at the schema registry.
    private static final Map<String, String> serdeConfig = Stream.of(
            new AbstractMap.SimpleEntry<>(SCHEMA_REGISTRY_URL_CONFIG, "http://localhost:8081"))
            .collect(Collectors.toMap(Map.Entry::getKey, Map.Entry::getValue));

    public static Serde<Data> DataAvro() {
        final Serde<Data> dataAvroSerde = new SpecificAvroSerde<>();
        // false = configure as a value serde (true would configure a key serde).
        dataAvroSerde.configure(serdeConfig, false);
        return dataAvroSerde;
    }
}
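With a helper like that, the Serde can be passed explicitly wherever Kafka Streams needs one, for example when declaring the state store that backs the transformer. A sketch, assuming STORE_NAME is the same constant used by CustomProcessor and that String keys are in play:

```java
import org.apache.kafka.common.serialization.Serdes;
import org.apache.kafka.streams.state.KeyValueStore;
import org.apache.kafka.streams.state.StoreBuilder;
import org.apache.kafka.streams.state.Stores;
import org.springframework.context.annotation.Bean;

// Hypothetical wiring: expose the store as a bean so the Kafka Streams
// binder registers it with the topology under STORE_NAME.
@Bean
public StoreBuilder<KeyValueStore<String, Double>> squareStore() {
    return Stores.keyValueStoreBuilder(
            Stores.persistentKeyValueStore(STORE_NAME),
            Serdes.String(),
            Serdes.Double());
}
```

An Avro-valued store would use CustomSerdes.DataAvro() as the value serde instead of Serdes.Double().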
