Spark: Best way to broadcast KafkaProducer to Spark Streaming
To broadcast a KafkaProducer to the Spark executors, I have created a wrapper like below:
import java.io.Serializable;
import java.util.Properties;
import org.apache.kafka.clients.producer.KafkaProducer;

public class KafkaSink implements Serializable {
    // static, so the field itself is not serialized with the broadcast;
    // each executor JVM lazily creates a single producer instance
    private static KafkaProducer<String, String> producer = null;

    public KafkaProducer<String, String> getInstance(final Properties properties) {
        if (producer == null) {
            producer = new KafkaProducer<>(properties);
        }
        return producer;
    }

    public void close() {
        if (producer != null) {
            producer.close();
        }
    }
}
and I am using it like below:
JavaSparkContext jsc = new JavaSparkContext(sc);
Broadcast<KafkaSink> kafkaSinkBroadcast = jsc.broadcast(new KafkaSink());
dataset.toJavaRDD().foreach(row -> kafkaSinkBroadcast.getValue()
        .getInstance(kafkaProducerProps())
        .send(new ProducerRecord<String, String>(topic, row.mkString(", "))));
I just wanted to know whether this is the right way to do it, or whether there is a better approach.
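For comparison, a pattern I have seen suggested is to skip the broadcast entirely and create the producer inside foreachPartition, so it is constructed directly on the executor. A minimal sketch, assuming dataset is a Dataset<Row> and that kafkaProducerProps() is a static helper (otherwise the lambda would capture and try to serialize the enclosing object):

import java.util.Iterator;
import org.apache.kafka.clients.producer.KafkaProducer;
import org.apache.kafka.clients.producer.ProducerRecord;
import org.apache.spark.sql.Row;

dataset.toJavaRDD().foreachPartition((Iterator<Row> rows) -> {
    // Constructed on the executor, one producer per partition; nothing
    // needs to be serialized or broadcast from the driver.
    KafkaProducer<String, String> producer = new KafkaProducer<>(kafkaProducerProps());
    while (rows.hasNext()) {
        producer.send(new ProducerRecord<>(topic, rows.next().mkString(", ")));
    }
    producer.flush();
    producer.close();
});

This trades the per-JVM singleton reuse of the broadcast wrapper for a simpler lifecycle (flush and close per partition), and I am unsure which of the two is preferable.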