I am trying to write to Confluent Cloud/Kafka from Dataflow (Apache Beam), using the following:
kafkaKnowledgeGraphKVRecords.apply("Write to Kafka", KafkaIO.<String, String>write()
.withBootstrapServers("<mybootstrapserver>.confluent.cloud:9092")
.withTopic("testtopic").withKeySerializer(StringSerializer.class)
.withProducerConfigUpdates(props).withValueSerializer(StringSerializer.class));
where Map<String, Object> props = new HashMap<>();
(i.e., empty for now)
In the logs, I get: send failed : 'Topic testtopic not present in metadata after 60000 ms.'
The topic does exist on this cluster, so my guess was that there is an authentication issue, which makes sense because I couldn't find a way to pass the API key.
I tried various combinations of passing the API key/secret from Confluent Cloud via the props above, but I couldn't find a working setup.
Found a solution, thanks to the pointers from @RobinMoffatt in the comments below the question.
Here's the setup I have now:
Map<String, Object> props = new HashMap<>();
props.put("ssl.endpoint.identification.algorithm", "https");
props.put("sasl.mechanism", "PLAIN");
props.put("request.timeout.ms", 20000);
props.put("retry.backoff.ms", 500);
props.put("sasl.jaas.config","org.apache.kafka.common.security.plain.PlainLoginModule required username=\"<APIKEY>\" password=\"<SECRET>\";");
props.put("security.protocol", "SASL_SSL");
kafkaKnowledgeGraphKVRecords.apply("Write to Kafka-TESTTOPIC", KafkaIO.<String, String>write()
.withBootstrapServers("<CLUSTER>.confluent.cloud:9092")
.withTopic("test").withKeySerializer(StringSerializer.class)
.withProducerConfigUpdates(props).withValueSerializer(StringSerializer.class));
The key line I had wrong was the sasl.jaas.config (note the ; at the end!).
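To make the trailing semicolon harder to forget, the producer config above can be wrapped in a small helper that assembles the JAAS string from the key and secret. This is just a sketch using plain JDK classes; the class and method names (ConfluentProducerProps, build) are hypothetical, and the API key/secret parameters are placeholders for your real credentials:

```java
import java.util.HashMap;
import java.util.Map;

public class ConfluentProducerProps {
    // Hypothetical helper: builds the producer config map for SASL/PLAIN
    // auth against Confluent Cloud, as used in the KafkaIO setup above.
    public static Map<String, Object> build(String apiKey, String apiSecret) {
        Map<String, Object> props = new HashMap<>();
        props.put("security.protocol", "SASL_SSL");
        props.put("sasl.mechanism", "PLAIN");
        props.put("ssl.endpoint.identification.algorithm", "https");
        // The semicolon INSIDE the JAAS config string is required;
        // building the string here guarantees it is always present.
        props.put("sasl.jaas.config", String.format(
                "org.apache.kafka.common.security.plain.PlainLoginModule required "
                        + "username=\"%s\" password=\"%s\";",
                apiKey, apiSecret));
        return props;
    }

    public static void main(String[] args) {
        Map<String, Object> props = build("<APIKEY>", "<SECRET>");
        // Sanity-check the trailing semicolon.
        String jaas = (String) props.get("sasl.jaas.config");
        System.out.println(jaas.endsWith(";"));
    }
}
```

The resulting map can then be passed to `.withProducerConfigUpdates(props)` exactly as in the snippet above.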