Spark: Best way to broadcast KafkaProducer to Spark Streaming

To broadcast a KafkaProducer to Spark executors, I have created a wrapper like the one below:

import java.io.Serializable;
import java.util.Properties;

import org.apache.kafka.clients.producer.KafkaProducer;

public class KafkaSink implements Serializable {
    // Static so the producer is not serialized from the driver;
    // it is created lazily, once per executor JVM.
    private static KafkaProducer<String, String> producer = null;

    public KafkaProducer<String, String> getInstance(final Properties properties) {
        if (producer == null) {
            producer = new KafkaProducer<>(properties);
        }
        return producer;
    }

    public void close() {
        if (producer != null) {
            producer.close();
        }
    }
}

and I am using it like below:

JavaSparkContext jsc = new JavaSparkContext(sc);
Broadcast<KafkaSink> kafkaSinkBroadcast = jsc.broadcast(new KafkaSink());
dataset.toJavaRDD().foreach(row ->
    kafkaSinkBroadcast.getValue()
        .getInstance(kafkaProducerProps())
        .send(new ProducerRecord<String, String>(topic, row.mkString(", "))));

I just wanted to know whether this is the right way to do it, or what the best way to do it would be.

I can really recommend this blog post. In short, you should create a serializable sink for each partition by passing a 'recipe' that creates the Kafka producer, so the producer itself is only instantiated lazily on the executors.
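A minimal sketch of that idea in Java, assuming the same KafkaProducer<String, String> setup as in the question; the class LazyKafkaSink, the SerializableSupplier interface, and the apply factory method are illustrative names, not taken from the linked post:

import java.io.Serializable;
import java.util.Map;
import java.util.function.Supplier;

import org.apache.kafka.clients.producer.KafkaProducer;
import org.apache.kafka.clients.producer.ProducerRecord;

public class LazyKafkaSink implements Serializable {

    // A Supplier that is also Serializable, so the 'recipe' lambda can be shipped to executors.
    public interface SerializableSupplier<T> extends Supplier<T>, Serializable {}

    private final SerializableSupplier<KafkaProducer<String, String>> recipe;
    private transient KafkaProducer<String, String> producer;

    public LazyKafkaSink(SerializableSupplier<KafkaProducer<String, String>> recipe) {
        this.recipe = recipe;
    }

    public void send(String topic, String value) {
        if (producer == null) {
            // Created lazily, once per executor JVM, the first time send() is called.
            producer = recipe.get();
        }
        producer.send(new ProducerRecord<>(topic, value));
    }

    public static LazyKafkaSink apply(Map<String, Object> config) {
        return new LazyKafkaSink(() -> {
            KafkaProducer<String, String> p = new KafkaProducer<>(config);
            // Flush buffered records before the executor JVM shuts down.
            Runtime.getRuntime().addShutdownHook(new Thread(p::close));
            return p;
        });
    }
}

It can then be broadcast and used much like the wrapper in the question, for example:

Broadcast<LazyKafkaSink> sink = jsc.broadcast(LazyKafkaSink.apply(kafkaConfig));
dataset.toJavaRDD().foreach(row -> sink.getValue().send(topic, row.mkString(", ")));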
