
Spark Streaming - write to Kafka topic

Can anyone please help me with this? I have a performance issue when using the code below to publish messages to Kafka:

message.foreachPartition { part =>
  // A new producer is created and closed for every partition in every batch,
  // which is what makes this version slow
  val producer = new KafkaProducer[String, String](props)
  part.foreach { msg =>
    val record = new ProducerRecord[String, String](topic, msg._1, msg._2)
    producer.send(record)
  }
  producer.close()
}
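
For context, props is not defined in the snippet above; a minimal sketch of the producer configuration it would need (the broker address is a placeholder):

import java.util.Properties

// Hypothetical configuration; only the three required settings are shown
val props = new Properties()
props.put("bootstrap.servers", "<server:ip>")
props.put("key.serializer", "org.apache.kafka.common.serialization.StringSerializer")
props.put("value.serializer", "org.apache.kafka.common.serialization.StringSerializer")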

So I used the KafkaSink approach from that post to optimise the performance. Below is the code I have written:

val kafkaSink = sparkContext.broadcast(KafkaSink(kafkaProps))

resultRDD.foreach { message =>
  kafkaSink.value.send(outputTopic, message._1, message._2)
}
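
kafkaProps is likewise not shown in the question; since KafkaSink.apply below takes a Map[String, Object], a hypothetical configuration could look like this (placeholder broker address):

// Hypothetical configuration map; KafkaSink.apply converts it with .asJava
val kafkaProps: Map[String, Object] = Map(
  "bootstrap.servers" -> "<server:ip>",
  "key.serializer"    -> "org.apache.kafka.common.serialization.StringSerializer",
  "value.serializer"  -> "org.apache.kafka.common.serialization.StringSerializer"
)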


import scala.collection.JavaConverters._

class KafkaSink(createProducer: () => KafkaProducer[String, String]) extends Serializable {
  // KafkaProducer itself is not serializable; only the factory function is shipped
  // to the executors, and the lazy val creates one producer per executor JVM
  lazy val producer = createProducer()

  def send(topic: String, key: String, value: String): Unit =
    producer.send(new ProducerRecord(topic, key, value))
}

object KafkaSink {
  def apply(config: Map[String, Object]): KafkaSink = {
    val f = () => {
      val producer = new KafkaProducer[String, String](config.asJava)
      sys.addShutdownHook {
        producer.close()
      }
      producer
    }
    new KafkaSink(f)
  }
}

But the program gets stuck and not even a single message is published to Kafka. I checked the logs and could only find the following in the YARN log file:

producer.KafkaProducer: Closing the Kafka producer with timeoutMillis = 9223372036854775807 ms

Could you please let me know what I am missing? The Spark version is 1.6.0. Currently, publishing the messages takes around 8 seconds for roughly 300K messages in a 20-second batch interval.

Thanks in advance.

Since there is no built-in way of writing messages to Kafka from Spark Structured Streaming before version 2.2, I would try something like this with a ForeachWriter:

Create a KafkaSink ForeachWriter

import java.util.Properties
import org.apache.kafka.clients.producer._
import org.apache.spark.sql.ForeachWriter

class KafkaSink(topic: String, servers: String) extends ForeachWriter[(String, String)] {
  val kafkaProperties = new Properties()
  kafkaProperties.put("bootstrap.servers", servers)
  kafkaProperties.put("key.serializer", "org.apache.kafka.common.serialization.StringSerializer")
  kafkaProperties.put("value.serializer", "org.apache.kafka.common.serialization.StringSerializer")

  var producer: KafkaProducer[String, String] = _

  // Called once per partition and epoch; create the producer here
  def open(partitionId: Long, version: Long): Boolean = {
    producer = new KafkaProducer(kafkaProperties)
    true
  }

  // The key and value are concatenated into the record value; no record key is set
  def process(value: (String, String)): Unit = {
    producer.send(new ProducerRecord(topic, value._1 + ":" + value._2))
  }

  def close(errorOrNull: Throwable): Unit = {
    producer.close()
  }
}

Write messages using the KafkaSink writer

val topic = "<topic2>"
val brokers = "<server:ip>"

val writer = new KafkaSink(topic, brokers)

val query =
  streamingSelectDF
    .writeStream
    .foreach(writer)
    .outputMode("update")
    .start()
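
For completeness, streamingSelectDF above is assumed to be a streaming Dataset of (String, String) pairs. A hypothetical way to build one (reading from another Kafka topic; names in angle brackets are placeholders, and spark is the SparkSession), plus keeping the query alive:

import spark.implicits._

// Hypothetical source; the ForeachWriter above expects (String, String) pairs
val streamingSelectDF =
  spark.readStream
    .format("kafka")
    .option("kafka.bootstrap.servers", "<server:ip>")
    .option("subscribe", "<topic1>")
    .load()
    .selectExpr("CAST(key AS STRING)", "CAST(value AS STRING)")
    .as[(String, String)]

// Block the driver until the streaming query terminates
query.awaitTermination()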
