
Kafka Client Consumer Config for Schema Registry and Broker

The following configs are on the server side:

For broker:
listener.security.protocol.map=EXTERNAL:SASL_SSL
kafka.rest.client.security.protocol=SASL_PLAINTEXT

For Schema Registry:
ssl.client.auth=true
ssl.enabled.protocols=TLSv1.2
ssl.key.password=${file:/mnt/sslcerts/jksPassword.txt:jksPassword}
ssl.keystore.location=/mnt/sslcerts/keystore.jks
ssl.keystore.password=${file:/mnt/sslcerts/jksPassword.txt:jksPassword}
ssl.truststore.location=/mnt/sslcerts/truststore.jks
ssl.truststore.password=${file:/mnt/sslcerts/jksPassword.txt:jksPassword}
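
The ${file:...:jksPassword} references above are resolved by a config provider, not by Schema Registry itself. A rough sketch of the wiring this assumes (Kafka's FileConfigProvider registered in the same properties file, plus the referenced password file; the provider registration lines are an assumption about how this cluster is set up):

config.providers=file
config.providers.file.class=org.apache.kafka.common.config.provider.FileConfigProvider

# /mnt/sslcerts/jksPassword.txt must contain the referenced key as a properties entry
jksPassword=<actual JKS password>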

The broker is SASL protected and the Schema Registry is mTLS protected.

I have the following consumer config in my Spring Boot app.

// Broker connection: SASL authentication over TLS (SASL_SSL)
props.put(ConsumerConfig.BOOTSTRAP_SERVERS_CONFIG, bootstrapAddress);
props.put(ConsumerConfig.KEY_DESERIALIZER_CLASS_CONFIG, KafkaJsonSchemaDeserializer.class);
props.put(ConsumerConfig.VALUE_DESERIALIZER_CLASS_CONFIG, KafkaJsonSchemaDeserializer.class);

props.put(CommonClientConfigs.SECURITY_PROTOCOL_CONFIG, "SASL_SSL");
props.put(SslConfigs.SSL_ENABLED_PROTOCOLS_CONFIG, "TLSv1.2");

// TLS trust/key material for the broker connection
props.put(SslConfigs.SSL_TRUSTSTORE_TYPE_CONFIG, "JKS");
props.put(SslConfigs.SSL_TRUSTSTORE_LOCATION_CONFIG, "./certs/truststore.jks");
props.put(SslConfigs.SSL_TRUSTSTORE_PASSWORD_CONFIG, "213fsfsK");

props.put(SslConfigs.SSL_KEYSTORE_TYPE_CONFIG, "JKS");
props.put(SslConfigs.SSL_KEYSTORE_LOCATION_CONFIG, "./certs/keystore.jks");
props.put(SslConfigs.SSL_KEYSTORE_PASSWORD_CONFIG, "213fsfsK");

props.put(SslConfigs.SSL_ENDPOINT_IDENTIFICATION_ALGORITHM_CONFIG, sslEndpointIdentificationAlgorithm);

// SASL credentials for the broker
props.put(SaslConfigs.SASL_MECHANISM, saslMechanism);
props.put(SaslConfigs.SASL_JAAS_CONFIG,
        String.format("%s required username=\"%s\" password=\"%s\";",
                PlainLoginModule.class.getName(), sasl_ssl_username, sasl_ssl_password));

// Schema Registry endpoint used by the JSON Schema deserializer
props.put("schema.registry.url", "https://schemaregistry.confluent.apps:443");

I am trying to configure both security protocols in the consumer. I am getting a bad certificate error and am not able to consume the messages. I just want to make sure I am using the right configurations, so that I can be certain the error is caused by the certificates.
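
For context, the deserializer opens its own HTTPS connection to Schema Registry, separate from the SASL_SSL connection to the broker, so an mTLS-protected registry needs its own trust and key material on the client side; since the registry requires a client certificate (ssl.client.auth=true), a handshake without a registry-side keystore is a plausible source of a bad certificate error. A minimal sketch of those properties, assuming a Confluent deserializer version that forwards schema.registry.-prefixed SSL settings to its registry client (paths and passwords below are placeholders, not taken from the question):

// Sketch only: schema.registry.-prefixed settings are passed to the Schema Registry
// HTTP client in recent Confluent serializer/deserializer versions.
props.put("schema.registry.url", "https://schemaregistry.confluent.apps:443");
props.put("schema.registry.ssl.truststore.location", "./certs/truststore.jks");
props.put("schema.registry.ssl.truststore.password", "<truststore-password>");
// mTLS: the registry asks for a client certificate, so a keystore is needed as well
props.put("schema.registry.ssl.keystore.location", "./certs/keystore.jks");
props.put("schema.registry.ssl.keystore.password", "<keystore-password>");
props.put("schema.registry.ssl.key.password", "<key-password>");

Older client versions that do not support this prefix typically fall back to JVM-wide settings such as -Djavax.net.ssl.keyStore and -Djavax.net.ssl.trustStore for the registry connection.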

I wasn't able to connect to Schema Registry using mTLS. So, to avoid this, I changed the schema deserializers in the JDBC connector so that I won't have to worry about Schema Registry at all.
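
The connector change itself isn't shown above; assuming it means switching the JDBC connector's value converter from the Schema-Registry-backed JSON Schema converter to the plain Kafka Connect JsonConverter, a rough sketch of the relevant lines would be:

# Registry-backed converter (needs HTTPS/mTLS access to Schema Registry):
# value.converter=io.confluent.connect.json.JsonSchemaConverter
# value.converter.schema.registry.url=https://schemaregistry.confluent.apps:443

# Registry-free alternative: embed the schema in each JSON record instead
value.converter=org.apache.kafka.connect.json.JsonConverter
value.converter.schemas.enable=true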
