How to consume Kafka messages via Spring Cloud Stream by default and also consume Kafka messages generated by the Confluent API?
I am building a microservice component which will, by default, consume Spring Cloud Stream (SCS) Kafka messages generated by other SCS components.
But I also have a requirement to consume Kafka messages from other components that use the Confluent API.
I have an example repository that shows what I'm trying to do:
https://github.com/donalthurley/KafkaConsumeScsAndConfluent
Below is the application configuration, with the SCS input binding and the Confluent input binding.
spring:
  application:
    name: kafka
  kafka:
    consumer:
      properties.schema.registry.url: http://192.168.99.100:8081
  cloud:
    stream:
      kafka:
        binder:
          brokers: PLAINTEXT://192.168.99.100:9092
          # configuration:
          #   specific:
          #     avro:
          #       reader: true
          #   key:
          #     deserializer: org.apache.kafka.common.serialization.StringDeserializer
          #   value:
          #     deserializer: io.confluent.kafka.serializers.KafkaAvroDeserializer
      bindings:
        inputConfluent:
          contentType: application/*+avro
          destination: confluent-destination
          group: input-confluent-group
        inputScs:
          contentType: application/*+avro
          destination: scs-destination
          group: input-scs-group
With the above configuration, both consumers are created with the SCS default configuration. For instance, org.apache.kafka.common.serialization.ByteArrayDeserializer is the value deserializer for both input bindings.
If I uncomment those lines in the configuration, both consumers are created with the configuration needed for my Confluent client. For instance, io.confluent.kafka.serializers.KafkaAvroDeserializer becomes the value deserializer for both input bindings.
I understand that because the configuration is set on the Kafka binder, it applies to all consumers defined with that binder.
Is there any way to define those specific properties so that they apply only to the Confluent-specific consumer binding, while all the other input bindings keep using the default SCS configuration?
You can set binding-specific consumer and producer properties via the configuration property. See the reference manual.
spring.cloud.stream.kafka.bindings.<channelName>.consumer.configuration.foo.bar=baz
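Applied to your configuration, moving the commented-out deserializer settings from the binder level down to the inputConfluent binding might look like the sketch below. This is only a sketch: the flattened property keys mirror your commented-out binder-level example, and you should verify them against the reference manual for your binder version.

spring:
  cloud:
    stream:
      kafka:
        bindings:
          inputConfluent:
            consumer:
              # Raw Kafka consumer properties for this binding only;
              # the inputScs binding keeps the SCS defaults.
              configuration:
                specific.avro.reader: true
                key.deserializer: org.apache.kafka.common.serialization.StringDeserializer
                value.deserializer: io.confluent.kafka.serializers.KafkaAvroDeserializer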
When using non-standard serializers/deserializers, you must set useNativeEncoding and useNativeDecoding for producers and consumers respectively. Again, see the reference manual.
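Combining both pieces for the consumer side, a possible configuration for this example could look like the following sketch. Note that useNativeDecoding is a core Spring Cloud Stream binding property, so it lives under spring.cloud.stream.bindings rather than the Kafka-specific section; the schema.registry.url entry is an assumption carried over from the spring.kafka block in the question.

spring:
  cloud:
    stream:
      bindings:
        inputConfluent:
          contentType: application/*+avro
          destination: confluent-destination
          group: input-confluent-group
          consumer:
            # Let the Kafka deserializer produce the payload instead of
            # the SCS message converters.
            useNativeDecoding: true
      kafka:
        bindings:
          inputConfluent:
            consumer:
              configuration:
                # Assumed: registry URL taken from the question's spring.kafka block.
                schema.registry.url: http://192.168.99.100:8081
                specific.avro.reader: true
                value.deserializer: io.confluent.kafka.serializers.KafkaAvroDeserializer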