
spring-cloud-stream-binder-kafka configuration for Confluent Cloud Schema Registry Unauthorized error

I'm having trouble configuring a connection to Confluent Cloud when using spring-cloud-stream-binder-kafka. Can anybody see what is wrong?

When I use the example from https://www.confluent.io/blog/schema-registry-avro-in-spring-boot-application-tutorial/ , it works fine and I can see messages on Confluent Cloud.

However, when I add the same connection details using the spring-cloud-stream-binder-kafka configuration, it returns an Unauthorized error:

Caused by: org.apache.kafka.common.errors.SerializationException: Error registering Avro schema: {"type":"record","name":"MySchema","namespace":"org.test","fields":[{"name":"value","type":"double"}]}

Caused by: io.confluent.kafka.schemaregistry.client.rest.exceptions.RestClientException: Unauthorized; error code: 401

My configuration below gives the above error; I am not sure what is going wrong.

spring:
  cloud:
    stream:
      default:
        producer:
          useNativeEncoding: true
      kafka:
        binder:
          brokers: myinstance.us-east1.gcp.confluent.cloud:9092
          producer-properties:
            key.serializer: org.apache.kafka.common.serialization.StringSerializer
            value.serializer: io.confluent.kafka.serializers.KafkaAvroSerializer
            schema.registry.url: https://myinstance.us-central1.gcp.confluent.cloud
            basic.auth.credentials.source: USER_INFO
            schema.registry.basic.auth.user.info: mySchemaKey:mySchemaSecret
          configuration:
            ssl.endpoint.identification.algorithm: https
            sasl.mechanism: PLAIN
            request.timeout.ms: 20000
            retry.backoff.ms: 500
            sasl.jaas.config: org.apache.kafka.common.security.plain.PlainLoginModule required username="myKey" password="MySecret";
            security.protocol: SASL_SSL
      bindings:
        normals-out:
          destination: normals
          contentType: application/*+avro

Example from Confluent that is working fine:

spring:
  kafka:
    bootstrap-servers:
      - myinstance.us-east1.gcp.confluent.cloud:9092
    properties:
      ssl.endpoint.identification.algorithm: https
      sasl.mechanism: PLAIN
      request.timeout.ms: 20000
      retry.backoff.ms: 500
      sasl.jaas.config: org.apache.kafka.common.security.plain.PlainLoginModule required username="myKey" password="MySecret";
      security.protocol: SASL_SSL
      schema.registry.url: https://myinstance.us-central1.gcp.confluent.cloud
      basic.auth.credentials.source: USER_INFO
      schema.registry.basic.auth.user.info: mySchemaKey:mySchemaSecret
    producer:
      key-serializer: org.apache.kafka.common.serialization.StringSerializer
      value-serializer: io.confluent.kafka.serializers.KafkaAvroSerializer
    template:
      default-topic:
logging:
  level:
    root: info

My issue was only that I was missing a dependency in my pom.

I should delete my question, but I leave it here as a reference that the configuration above does actually work.

<dependency>
  <groupId>io.confluent</groupId>
  <artifactId>kafka-schema-registry-client</artifactId>
  <version>5.3.0</version>
</dependency>
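If that artifact fails to resolve, note that io.confluent artifacts are published to Confluent's own Maven repository rather than Maven Central, so the repository may also need to be declared in the pom (a sketch):

```xml
<!-- Sketch: Confluent's Maven repository, needed if io.confluent
     artifacts cannot be resolved from Maven Central. -->
<repositories>
  <repository>
    <id>confluent</id>
    <url>https://packages.confluent.io/maven/</url>
  </repository>
</repositories>
```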
