
spark kafka producer serializable

I get the following exception:

ERROR yarn.ApplicationMaster: User class threw exception: org.apache.spark.SparkException: Task not serializable
org.apache.spark.SparkException: Task not serializable
    at org.apache.spark.util.ClosureCleaner$.ensureSerializable(ClosureCleaner.scala:304)
    at org.apache.spark.util.ClosureCleaner$.org$apache$spark$util$ClosureCleaner$$clean(ClosureCleaner.scala:294)
    at org.apache.spark.util.ClosureCleaner$.clean(ClosureCleaner.scala:122)
    at org.apache.spark.SparkContext.clean(SparkContext.scala:2032)
    at org.apache.spark.rdd.RDD$$anonfun$foreach$1.apply(RDD.scala:889)
    at org.apache.spark.rdd.RDD$$anonfun$foreach$1.apply(RDD.scala:888)
    at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:147)
    at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:108)
    at org.apache.spark.rdd.RDD.withScope(RDD.scala:306)
    at org.apache.spark.rdd.RDD.foreach(RDD.scala:888)
    at com.Boot$.test(Boot.scala:60)
    at com.Boot$.main(Boot.scala:36)
    at com.Boot.main(Boot.scala)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:606)
    at org.apache.spark.deploy.yarn.ApplicationMaster$$anon$2.run(ApplicationMaster.scala:525)
Caused by: java.io.NotSerializableException: org.apache.kafka.clients.producer.KafkaProducer
Serialization stack:
    - object not serializable (class: org.apache.kafka.clients.producer.KafkaProducer, value: org.apache.kafka.clients.producer.KafkaProducer@77624599)
    - field (class: com.Boot$$anonfun$test$1, name: producer$1, type: class org.apache.kafka.clients.producer.KafkaProducer)
    - object (class com.Boot$$anonfun$test$1, )
    at org.apache.spark.serializer.SerializationDebugger$.improveException(SerializationDebugger.scala:40)
    at org.apache.spark.serializer.JavaSerializationStream.writeObject(JavaSerializer.scala:47)
    at org.apache.spark.serializer.JavaSerializerInstance.serialize(JavaSerializer.scala:84)
    at org.apache.spark.util.ClosureCleaner$.ensureSerializable(ClosureCleaner.scala:301)

//    @transient
val sparkConf = new SparkConf()

sparkConf.set("spark.serializer", "org.apache.spark.serializer.KryoSerializer")

//    @transient
val sc = new SparkContext(sparkConf)

val requestSet: RDD[String] = sc.textFile(s"hdfs:/user/bigdata/ADVERTISE-IMPRESSION-STAT*/*")

//    @transient
val props = new HashMap[String, Object]()
props.put(ProducerConfig.BOOTSTRAP_SERVERS_CONFIG, NearLineConfig.kafka_brokers)
//    props.put(ProducerConfig.KEY_SERIALIZER_CLASS_CONFIG, "org.apache.kafka.common.serialization.ByteArraySerializer");
//    props.put(ProducerConfig.VALUE_SERIALIZER_CLASS_CONFIG, "org.apache.kafka.common.serialization.ByteArraySerializer");
props.put(ProducerConfig.VALUE_SERIALIZER_CLASS_CONFIG, "org.apache.kafka.common.serialization.StringSerializer")
props.put(ProducerConfig.KEY_SERIALIZER_CLASS_CONFIG, "org.apache.kafka.common.serialization.StringSerializer")
props.put("producer.type", "async")
props.put(ProducerConfig.BATCH_SIZE_CONFIG, "49152")

//    @transient
val producer: KafkaProducer[String, String] = new KafkaProducer[String, String](props)

requestSet.foreachPartition((partitions: Iterator[String]) => {
  partitions.foreach((line: String) => {
    try {
      producer.send(new ProducerRecord[String, String]("testtopic", line))
    } catch {
      case ex: Exception => {
        log.warn(ex.getMessage, ex)
      }
    }
  })
})

producer.close()

In this program I try to read records from an HDFS path and send them to Kafka. The problem is that when I remove the code that sends records to Kafka, it runs fine. What am I missing?

KafkaProducer isn't serializable. You'll need to move the creation of the instance inside foreachPartition:

requestSet.foreachPartition((partitions: Iterator[String]) => {
  // The producer is now instantiated on the executor, inside the closure,
  // so it never has to be serialized and shipped from the driver.
  val producer: KafkaProducer[String, String] = new KafkaProducer[String, String](props)
  partitions.foreach((line: String) => {
    try {
      producer.send(new ProducerRecord[String, String]("testtopic", line))
    } catch {
      case ex: Exception => {
        log.warn(ex.getMessage, ex)
      }
    }
  })
})

Note that KafkaProducer.send returns a Future[RecordMetadata], and the only exception that can propagate from it directly is a SerializationException if the key or value can't be serialized.
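
Because of that, the try/catch around send only catches errors raised while the record is being serialized and buffered; broker-side failures surface later through the returned Future or through a callback. A minimal sketch (assuming the producer, line and log values from the code above) of both ways to observe those failures:

import java.util.concurrent.Future
import org.apache.kafka.clients.producer.{Callback, ProducerRecord, RecordMetadata}

// Option 1: block on the returned Future; get() rethrows any broker-side error.
val result: Future[RecordMetadata] =
  producer.send(new ProducerRecord[String, String]("testtopic", line))
result.get()

// Option 2: pass a Callback and handle the error asynchronously.
producer.send(
  new ProducerRecord[String, String]("testtopic", line),
  new Callback {
    override def onCompletion(metadata: RecordMetadata, exception: Exception): Unit = {
      if (exception != null) log.warn(exception.getMessage, exception)
    }
  })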

I don't recommend Yuval Itzchakov's answer, because it opens and closes a lot of sockets; even with Kafka, opening a connection to the brokers is heavy and slow. I strongly suggest reading this blog post: https://allegro.tech/2015/08/spark-kafka-integration.html . I have used and tested that approach, and it is the best option I have found for a production environment.
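
The idea behind that approach is to create the producer lazily, once per executor JVM, and reuse it across partitions instead of opening a new broker connection for every partition. A minimal sketch of that pattern, not taken verbatim from the linked post (the ProducerHolder object, broker address and serializer settings below are placeholders; requestSet is the RDD from the question):

import java.util.HashMap
import org.apache.kafka.clients.producer.{KafkaProducer, ProducerRecord}

object ProducerHolder {
  // Initialized at most once per executor JVM, on first use,
  // then reused by every task that runs in that JVM.
  lazy val producer: KafkaProducer[String, String] = {
    val props = new HashMap[String, Object]()
    props.put("bootstrap.servers", "broker1:9092")  // placeholder broker list
    props.put("key.serializer", "org.apache.kafka.common.serialization.StringSerializer")
    props.put("value.serializer", "org.apache.kafka.common.serialization.StringSerializer")
    new KafkaProducer[String, String](props)
  }
}

requestSet.foreachPartition { partition =>
  partition.foreach { line =>
    ProducerHolder.producer.send(new ProducerRecord[String, String]("testtopic", line))
  }
}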
