
How to configure a Kafka consumer with the SASL mechanism PLAIN and the security protocol SASL_SSL in Java?

I want to create a Kafka consumer that uses the security protocol SASL_SSL and the SASL mechanism PLAIN. Can someone help me with this configuration?

I have read many documents on how to configure the SASL details, but I still don't have a clear picture of how to do it. Here is the code I use to create the Kafka consumer:

Properties props = new Properties();
props.put("bootstrap.servers", "servers");
String consumeGroup = "consumer_group";
props.put("sasl.jaas.config", "org.apache.kafka.common.security.plain.PlainLoginModule required username=\"username\" password=\"password\"");
props.put("group.id", consumeGroup);
props.put("client.id", "client_id");
props.put("security.protocol", "SASL_SSL");
props.put("enable.auto.commit", "true");
props.put("auto.commit.interval.ms", "101");
props.put("max.partition.fetch.bytes", "135");
// props.put("auto.offset.reset", "earliest");
props.put("heartbeat.interval.ms", "3000");
props.put("session.timeout.ms", "6001");
props.put("key.deserializer", "org.apache.kafka.common.serialization.StringDeserializer");
props.put("value.deserializer", "org.apache.kafka.common.serialization.StringDeserializer");
return new KafkaConsumer<String, String>(props);

Stack trace:

    14:56:12.767 [main] DEBUG o.a.k.clients.consumer.KafkaConsumer - Starting the Kafka consumer
    14:56:12.776 [main] DEBUG org.apache.kafka.clients.Metadata - Updated cluster metadata version 1 to Cluster(nodes = [Node(-2, kafka-events-nonprod-ds1.i, 9092), Node(-3, kafka-events-nonprod-ds1-3.io, 9092), Node(-1, kafka-events-nonprod-ds1-1.io, 9092)], partitions = [])
    14:56:12.789 [main] DEBUG o.a.kafka.common.metrics.Metrics - Added sensor with name connections-closed:client-id-client_id
    14:56:12.845 [main] DEBUG o.a.kafka.common.metrics.Metrics - Added sensor with name connections-created:client-id-client_id
    14:56:12.846 [main] DEBUG o.a.kafka.common.metrics.Metrics - Added sensor with name bytes-sent-received:client-id-client_id
    14:56:12.846 [main] DEBUG o.a.kafka.common.metrics.Metrics - Added sensor with name bytes-sent:client-id-client_id
    14:56:12.847 [main] DEBUG o.a.kafka.common.metrics.Metrics - Added sensor with name bytes-received:client-id-client_id
    14:56:12.847 [main] DEBUG o.a.kafka.common.metrics.Metrics - Added sensor with name select-time:client-id-client_id
    14:56:12.847 [main] DEBUG o.a.kafka.common.metrics.Metrics - Added sensor with name io-time:client-id-client_id
    14:56:12.861 [main] DEBUG o.a.kafka.common.metrics.Metrics - Added sensor with name heartbeat-latency
    14:56:12.862 [main] DEBUG o.a.kafka.common.metrics.Metrics - Added sensor with name join-latency
    14:56:12.862 [main] DEBUG o.a.kafka.common.metrics.Metrics - Added sensor with name sync-latency
    14:56:12.865 [main] DEBUG o.a.kafka.common.metrics.Metrics - Added sensor with name commit-latency
    14:56:12.873 [main] DEBUG o.a.kafka.common.metrics.Metrics - Added sensor with name bytes-fetched
    14:56:12.874 [main] DEBUG o.a.kafka.common.metrics.Metrics - Added sensor with name records-fetched
    14:56:12.879 [main] DEBUG o.a.kafka.common.metrics.Metrics - Added sensor with name fetch-latency
    14:56:12.881 [main] DEBUG o.a.kafka.common.metrics.Metrics - Added sensor with name records-lag
    14:56:12.882 [main] DEBUG o.a.kafka.common.metrics.Metrics - Added sensor with name fetch-throttle-time
    14:56:12.883 [main] WARN  o.a.k.c.consumer.ConsumerConfig - The configuration sasl.jaas.config = org.apache.kafka.common.security.plain.PlainLoginModule required username="username" password="password" was supplied but isn't a known config.
    14:56:12.885 [main] INFO  o.a.kafka.common.utils.AppInfoParser - Kafka version : 0.9.0.0
    14:56:12.885 [main] INFO  o.a.kafka.common.utils.AppInfoParser - Kafka commitId : fc7243c2af4b2b4a
    14:56:12.886 [main] DEBUG o.a.k.clients.consumer.KafkaConsumer - Kafka consumer created
    14:56:12.887 [main] DEBUG o.a.k.clients.consumer.KafkaConsumer - Subscribed to topic(s): topic_name
    14:56:12.887 [main] DEBUG o.a.k.c.c.i.AbstractCoordinator - Issuing group metadata request to broker -2
    14:56:12.918 [main] DEBUG o.apache.kafka.clients.NetworkClient - Initiating connection to node -2 at kafka-events-nonprod-ds1.i:9092.
    14:56:13.336 [main] DEBUG o.a.kafka.common.metrics.Metrics - Added sensor with name node--2.bytes-sent
    14:56:13.336 [main] DEBUG o.a.kafka.common.metrics.Metrics - Added sensor with name node--2.bytes-received
    14:56:13.337 [main] DEBUG o.a.kafka.common.metrics.Metrics - Added sensor with name node--2.latency
    14:56:13.339 [main] DEBUG o.apache.kafka.clients.NetworkClient - Completed connection to node -2
    14:56:13.343 [main] DEBUG o.apache.kafka.clients.NetworkClient - Sending metadata request ClientRequest(expectResponse=true, callback=null, request=RequestSend(header={api_key=3,api_version=0,correlation_id=1,client_id=client_id}, body={topics=[topic_name]}), isInitiatedByNetworkClient, createdTimeMs=1568193973342, sendTimeMs=0) to node -2
    14:56:13.986 [main] DEBUG o.a.kafka.common.network.Selector - Connection with kafka-events-nonprod-ds1-2.octanner.i/10.84.20.85 disconnected
    java.io.EOFException: null
        at org.apache.kafka.common.network.NetworkReceive.readFromReadableChannel(NetworkReceive.java:99) ~[kafka-clients-0.9.0.0.jar:na]
        at org.apache.kafka.common.network.NetworkReceive.readFrom(NetworkReceive.java:71) ~[kafka-clients-0.9.0.0.jar:na]
        at org.apache.kafka.common.network.KafkaChannel.receive(KafkaChannel.java:160) ~[kafka-clients-0.9.0.0.jar:na]
        at org.apache.kafka.common.network.KafkaChannel.read(KafkaChannel.java:141) ~[kafka-clients-0.9.0.0.jar:na]
        at org.apache.kafka.common.network.Selector.poll(Selector.java:286) ~[kafka-clients-0.9.0.0.jar:na]
        at org.apache.kafka.clients.NetworkClient.poll(NetworkClient.java:270) [kafka-clients-0.9.0.0.jar:na]
        at org.apache.kafka.clients.consumer.internals.ConsumerNetworkClient.clientPoll(ConsumerNetworkClient.java:303) [kafka-clients-0.9.0.0.jar:na]
        at org.apache.kafka.clients.consumer.internals.ConsumerNetworkClient.poll(ConsumerNetworkClient.java:197) [kafka-clients-0.9.0.0.jar:na]
        at org.apache.kafka.clients.consumer.internals.ConsumerNetworkClient.poll(ConsumerNetworkClient.java:187) [kafka-clients-0.9.0.0.jar:na]
        at org.apache.kafka.clients.consumer.internals.ConsumerNetworkClient.awaitMetadataUpdate(ConsumerNetworkClient.java:126) [kafka-clients-0.9.0.0.jar:na]
        at org.apache.kafka.clients.consumer.internals.AbstractCoordinator.ensureCoordinatorKnown(AbstractCoordinator.java:186) [kafka-clients-0.9.0.0.jar:na]
        at org.apache.kafka.clients.consumer.KafkaConsumer.pollOnce(KafkaConsumer.java:857) [kafka-clients-0.9.0.0.jar:na]
        at org.apache.kafka.clients.consumer.KafkaConsumer.poll(KafkaConsumer.java:829) [kafka-clients-0.9.0.0.jar:na]
        at kafka.Consumer.processRecords(Consumer.java:54) [classes/:na]
        at kafka.Consumer.execute(Consumer.java:22) [classes/:na]
        at kafka.Consumer.main(Consumer.java:15) [classes/:na]

The deserialization function:

private static void processRecords(KafkaConsumer<String, Object> consumer) throws InterruptedException {
    while (true) {
        ConsumerRecords<String, Object> records = consumer.poll(TimeUnit.MINUTES.toMillis(1));
        long lastOffset = 0;
        for (ConsumerRecord<String, Object> record : records) {
            System.out.printf("\n\n\n\n\n\n\roffset = %d, key = %s\n\n\n\n\n\n", record.offset(), record.value());
            lastOffset = record.offset();
        }
        System.out.println("lastOffset read: " + lastOffset);
        process();
    }
}

Support for the PLAIN mechanism was added in Kafka 0.10. The version you are using, Kafka 0.9, only supports the GSSAPI mechanism.

Once you have switched to a recent version, you only need to set at least the following configuration:

Properties props = new Properties();
props.put(CommonClientConfigs.BOOTSTRAP_SERVERS_CONFIG, <BROKERS>);
props.put(CommonClientConfigs.SECURITY_PROTOCOL_CONFIG, "SASL_SSL");
props.put(SaslConfigs.SASL_MECHANISM, "PLAIN");
props.put(SaslConfigs.SASL_JAAS_CONFIG, "org.apache.kafka.common.security.plain.PlainLoginModule required username=\"" + username + "\" password=\"" + password + "\";");

Note that support for SaslConfigs.SASL_JAAS_CONFIG was added in Kafka 0.10.2. Before that, you need to use a JAAS file instead. See Kafka "Login module not specified in JAAS config" for details.
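For those older clients, the JAAS file approach would look roughly like this (a sketch; the credentials and file path are placeholders for your own values):

```
// kafka_client_jaas.conf
KafkaClient {
    org.apache.kafka.common.security.plain.PlainLoginModule required
    username="username"
    password="password";
};
```

The client JVM is then pointed at the file with a system property, e.g. `-Djava.security.auth.login.config=/path/to/kafka_client_jaas.conf`, and `security.protocol` and `sasl.mechanism` are still set in the consumer properties as above.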

I recommend moving to the latest Kafka version whenever possible.
