spring-cloud-stream-binder-kafka configuration for Confluent Cloud Schema Registry Unauthorized error
I'm having trouble configuring a connection to Confluent Cloud when using spring-cloud-stream-binder-kafka. Can anybody see what is wrong?
When I use the example from https://www.confluent.io/blog/schema-registry-avro-in-spring-boot-application-tutorial/ it works fine and I can see messages on Confluent Cloud.
However, when I add the same connection details using the spring-cloud-stream-binder-kafka config, it returns an Unauthorized error:
Caused by: org.apache.kafka.common.errors.SerializationException: Error registering Avro schema: {"type":"record","name":"MySchema","namespace":"org.test","fields":[{"name":"value","type":"double"}]}
Caused by: io.confluent.kafka.schemaregistry.client.rest.exceptions.RestClientException: Unauthorized; error code: 401
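The 401 comes from the Schema Registry REST call, not from the Kafka brokers: with basic.auth.credentials.source set to USER_INFO, the serializer's REST client base64-encodes the key:secret pair and sends it as an HTTP Basic Authorization header. A minimal sketch of the header it constructs (credentials are the placeholders from the config below, not real values):

```java
import java.util.Base64;

public class BasicAuthHeader {
    // Mirrors what the Schema Registry REST client does with USER_INFO:
    // base64-encode the "key:secret" pair and send it as an HTTP Basic header.
    static String header(String userInfo) {
        String token = Base64.getEncoder()
                .encodeToString(userInfo.getBytes());
        return "Basic " + token;
    }

    public static void main(String[] args) {
        // Placeholder credentials; a 401 means the registry rejected this pair.
        System.out.println(header("mySchemaKey:mySchemaSecret"));
    }
}
```

If this key/secret pair is rejected when sent directly to the registry's REST API, the problem is the credentials themselves rather than the Spring configuration.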
My configuration below gives the above error. Does anyone know what is going wrong?
cloud:
  stream:
    default:
      producer:
        useNativeEncoding: true
    kafka:
      binder:
        brokers: myinstance.us-east1.gcp.confluent.cloud:9092
        producer-properties:
          key.serializer: org.apache.kafka.common.serialization.StringSerializer
          value.serializer: io.confluent.kafka.serializers.KafkaAvroSerializer
          schema.registry.url: https://myinstance.us-central1.gcp.confluent.cloud
          basic.auth.credentials.source: USER_INFO
          schema.registry.basic.auth.user.info: mySchemaKey:mySchemaSecret
        configuration:
          ssl.endpoint.identification.algorithm: https
          sasl.mechanism: PLAIN
          request.timeout.ms: 20000
          retry.backoff.ms: 500
          sasl.jaas.config: org.apache.kafka.common.security.plain.PlainLoginModule required username="myKey" password="MySecret";
          security.protocol: SASL_SSL
    bindings:
      normals-out:
        destination: normals
        contentType: application/*+avro
Example from Confluent that is working fine:
kafka:
  bootstrap-servers:
    - myinstance.us-east1.gcp.confluent.cloud:9092
  properties:
    ssl.endpoint.identification.algorithm: https
    sasl.mechanism: PLAIN
    request.timeout.ms: 20000
    retry.backoff.ms: 500
    sasl.jaas.config: org.apache.kafka.common.security.plain.PlainLoginModule required username="myKey" password="MySecret";
    security.protocol: SASL_SSL
    schema.registry.url: https://myinstance.us-central1.gcp.confluent.cloud
    basic.auth.credentials.source: USER_INFO
    schema.registry.basic.auth.user.info: mySchemaKey:mySchemaSecret
  producer:
    key-serializer: org.apache.kafka.common.serialization.StringSerializer
    value-serializer: io.confluent.kafka.serializers.KafkaAvroSerializer
  template:
    default-topic:
logging:
  level:
    root: info
My issue was only that I was missing a dependency in my pom.xml. I should delete my question, but I leave it here as a reference that the configuration above does actually work.
<dependency>
  <groupId>io.confluent</groupId>
  <artifactId>kafka-schema-registry-client</artifactId>
  <version>5.3.0</version>
</dependency>
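For reference, in case the artifact does not resolve: Confluent publishes its client libraries to its own Maven repository rather than Maven Central, so the pom.xml may also need a repository entry along the lines of the following (the repository id is arbitrary):

```xml
<repositories>
  <repository>
    <id>confluent</id>
    <url>https://packages.confluent.io/maven/</url>
  </repository>
</repositories>
```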