Spring Cloud Stream Kafka with Confluent is not producing same message as Spring Kafka with Confluent
I would like to use Spring Cloud Stream Kafka for my Java/Spring service, and I need to produce Confluent-serialized messages because I have .NET and NodeJS clients that use Confluent APIs to consume them.
So far, Spring Kafka with the Confluent serializer is working for us, while Spring Cloud Stream Kafka with the Confluent serializer is giving us problems.
To demonstrate where I can see a difference between the two cases, I have created two example repositories on GitHub containing only the code needed to produce a simple message in each case.
With Spring Kafka and Confluent: https://github.com/donalthurley/springKafkaAvro
With Spring Cloud Stream Kafka and Confluent: https://github.com/donalthurley/springCloudKafkaAvro
I think I have configured the useNativeEncoding flag and the Confluent serializer settings correctly for the Spring Cloud application; they can be seen in the source code here: https://github.com/donalthurley/springCloudKafkaAvro/blob/master/src/main/resources/application.yaml#L8
    kafka:
      binder:
        useNativeEncoding: true
        brokers: 127.0.0.1:9092
      bindings:
        output:
          producer:
            configuration:
              schema.registry.url: http://127.0.0.1:8081
              key.serializer: io.confluent.kafka.serializers.KafkaAvroSerializer
              value.serializer: io.confluent.kafka.serializers.KafkaAvroSerializer
When I send the same simple message from my Spring Kafka application and from my Spring Cloud Stream Kafka application, the logs show:
Producing Kafka person event: {"lastName": "Doe", "firstName": "John"}
When I use the Kafka Topics UI browser from my Docker Kafka environment (see https://hub.docker.com/r/landoop/fast-data-dev/) and view the message raw data, it is different in the two cases.
It looks correct for Spring Kafka, as the browser recognises and displays the fields inside the message value.
    [
      {
        "topic": "test_spring_kafka",
        "key": "3197449393600061094",
        "value": {
          "lastName": "Doe",
          "firstName": "John"
        },
        "partition": 0,
        "offset": 0
      }
    ]
In the Spring Cloud Stream Kafka raw data, the browser fails to recognise the fields, which shows that the messages are not the same.
    [
      {
        "topic": "test_spring_cloud_kafka",
        "key": "-6214497758709596999",
        "value": "\u0006Doe\bJohn",
        "partition": 0,
        "offset": 0
      }
    ]
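As a side note on why the value renders as `\u0006Doe\bJohn`: that is the plain Avro binary encoding of the two string fields, without the Confluent wire-format header (a magic byte 0x00 followed by a 4-byte schema-registry ID) that Confluent-aware clients expect. A minimal sketch in plain Java (no Kafka dependencies) reproducing the length-prefix bytes, assuming short ASCII strings so each zig-zag varint fits in one byte:

```java
// Sketch: why the raw Spring Cloud Stream value looks like "\u0006Doe\bJohn".
// Avro's binary encoding prefixes each string with its byte length as a
// zig-zag varint: "Doe" (3 bytes) -> 0x06, "John" (4 bytes) -> 0x08 ('\b').
// A Confluent-framed message would instead start with magic byte 0x00 and a
// 4-byte schema-registry ID before the Avro payload.
public class AvroFraming {
    // Zig-zag encoding of a small int, per the Avro specification.
    // For lengths under 64 the resulting varint is a single byte.
    static int zigZag(int n) {
        return (n << 1) ^ (n >> 31);
    }

    public static void main(String[] args) {
        StringBuilder raw = new StringBuilder();
        for (String field : new String[] {"Doe", "John"}) {
            raw.append((char) zigZag(field.length())); // length prefix byte
            raw.append(field);                         // the string bytes
        }
        // Matches the raw value shown by the Kafka Topics UI.
        System.out.println(raw.toString().equals("\u0006Doe\bJohn")); // true
    }
}
```

This confirms the Spring Cloud Stream producer was writing bare Avro rather than Confluent-serialized Avro, which is why the Topics UI (and the .NET/NodeJS Confluent consumers) cannot decode it.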
I think there may be an issue producing the Confluent messages with Spring Cloud Stream Kafka, while the Spring Kafka implementation produces them correctly, but maybe I am missing something in my implementation. Could someone help me with this problem?
The problem is with the way you configure useNativeEncoding. It was not taking effect. This configuration should work:
    spring:
      application:
        name: springCloudKafkaAvro
      cloud:
        stream:
          schemaRegistryClient:
            endpoint: http://127.0.0.1:8081
          kafka:
            binder:
              brokers: 127.0.0.1:9092
            bindings:
              output:
                producer:
                  configuration:
                    schema.registry.url: http://127.0.0.1:8081
                    key.serializer: io.confluent.kafka.serializers.KafkaAvroSerializer
                    value.serializer: io.confluent.kafka.serializers.KafkaAvroSerializer
          bindings:
            output:
              destination: test_spring_cloud_kafka
              producer:
                useNativeEncoding: true
Notice how useNativeEncoding is moved relative to your original config: it is a binding-level producer property (spring.cloud.stream.bindings.<binding>.producer.useNativeEncoding), not a property of the Kafka binder, so it had no effect where you originally placed it.
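For completeness, a consumer in the same Spring Cloud Stream application would need the mirror-image settings: native decoding enabled at the binding level and the Confluent deserializers in the binder configuration. A sketch along the same lines, assuming a binding named `input` reading the same topic (binding and topic names are illustrative):

```yaml
spring:
  cloud:
    stream:
      kafka:
        bindings:
          input:
            consumer:
              configuration:
                schema.registry.url: http://127.0.0.1:8081
                key.deserializer: io.confluent.kafka.serializers.KafkaAvroDeserializer
                value.deserializer: io.confluent.kafka.serializers.KafkaAvroDeserializer
                specific.avro.reader: true
      bindings:
        input:
          destination: test_spring_cloud_kafka
          consumer:
            useNativeDecoding: true
```

As with the producer side, useNativeDecoding lives under the binding's consumer section, while the Confluent deserializer settings go into the Kafka binder's per-binding configuration.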