
How to consume from Kafka with Spring Cloud Stream by default, and also consume Kafka messages generated by the Confluent API?

I am building a microservice component which will, by default, consume Spring Cloud Stream (SCS) Kafka messages generated by other SCS components.

But I also have a requirement to consume Kafka messages from other components that use the Confluent API.

I have an example repository that shows what I'm trying to do:

https://github.com/donalthurley/KafkaConsumeScsAndConfluent

Below is the application configuration, with the SCS input binding and the Confluent input binding.

spring:
  application:
    name: kafka
  kafka:
    consumer:
      properties.schema.registry.url: http://192.168.99.100:8081
  cloud:
    stream:
      kafka:
        binder:
          brokers: PLAINTEXT://192.168.99.100:9092
#          configuration:
#            specific:
#              avro:
#                reader: true
#            key:
#              deserializer: org.apache.kafka.common.serialization.StringDeserializer
#            value:
#              deserializer: io.confluent.kafka.serializers.KafkaAvroDeserializer

      bindings:
        inputConfluent:
          contentType: application/*+avro
          destination: confluent-destination
          group: input-confluent-group
        inputScs:
          contentType: application/*+avro
          destination: scs-destination
          group: input-scs-group

With the above configuration, both consumers are created with the SCS default configuration; for instance, org.apache.kafka.common.serialization.ByteArrayDeserializer is the value deserializer for both input bindings.

If I uncomment the configuration above, both consumers are created with the configuration required by my Confluent client; for instance, io.confluent.kafka.serializers.KafkaAvroDeserializer is the value deserializer for both input bindings.

I understand that because the configuration is set on the Kafka binder, it applies to all consumers defined with that binder.

Is there any way to define those specific properties so that they apply only to the Confluent-specific consumer binding, while all the other input bindings use the default SCS configuration?

You can set binding-specific consumer and producer properties via the configuration property.

See the reference manual.

spring.cloud.stream.kafka.bindings.<channelName>.consumer.configuration.foo.bar=baz
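Applied to the configuration in the question, a sketch might look like the following. It moves the commented-out binder-level properties under the inputConfluent binding only (property names taken from the question's commented block; assuming they are valid for your Confluent setup):

```yaml
spring:
  cloud:
    stream:
      kafka:
        bindings:
          inputConfluent:
            consumer:
              # These apply only to the inputConfluent binding,
              # leaving inputScs on the SCS defaults.
              configuration:
                schema.registry.url: http://192.168.99.100:8081
                specific.avro.reader: true
                key.deserializer: org.apache.kafka.common.serialization.StringDeserializer
                value.deserializer: io.confluent.kafka.serializers.KafkaAvroDeserializer
```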

When using non-standard serializers/deserializers, you must set useNativeEncoding and useNativeDecoding for producers and consumers, respectively. Again, see the reference manual.
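For the consumer side here, a minimal sketch of enabling native decoding on only the Confluent binding (note this is a core binding property, so it sits under spring.cloud.stream.bindings rather than the kafka-specific section):

```yaml
spring:
  cloud:
    stream:
      bindings:
        inputConfluent:
          consumer:
            # Let the Kafka deserializer (KafkaAvroDeserializer) handle
            # the payload instead of SCS message conversion.
            useNativeDecoding: true
```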
