Exception in thread "main" org.apache.spark.SparkException: Task not serializable
I am getting the above error while running the code below. I can see there is a serialization problem somewhere, but I couldn't trace exactly where. Can anyone explain what I should do here? Thanks in advance.
    def checkforType(json: String): String = {
      val parsedjson = parse(json)
      val res = (parsedjson \\ "Head" \\ "Type").extract[String]
      (res)
    }

    val dstream = KafkaUtils.createStream(ssc, zkQuorum, group, Map("topic" -> 1)).map(_._2)
    val pType = dstream.map(checkforType)
    pType.map(rdd => {
      val pkt = rdd.toString()
      if (pkt.equals("P300")) {
        val t300 = dstream.map(par300)
        t300.print()
      } else if (pkt.equals("P30")) {
        val t30 = dstream.map(par30)
        t30.print()
      } else if (pkt.equals("P6")) {
        val t6 = dstream.map(par6)
        t6.print()
      }
    })
This mainly happens when you pass an object to a transformation and that object is not serializable. In your code, the closure you pass to `pType.map` references `dstream` itself (through `dstream.map(par300)` and the other branches), so Spark has to serialize the DStream to ship the closure to the executors, and DStreams are not serializable. You cannot transform one DStream from inside another DStream's transformation; the branching has to be restructured on the driver side.

I found one interesting post on this: https://jaceklaskowski.gitbooks.io/mastering-apache-spark/content/spark-tips-and-tricks-sparkexception-task-not-serializable.html

Maybe this can solve your problem. Thanks!
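One way to restructure it is to branch with `filter` on the driver side, so each branch is its own DStream pipeline and no DStream is captured inside another stream's closure. This is only a sketch under the question's assumptions: `ssc`, `zkQuorum`, `group`, and the parser functions `par300`/`par30`/`par6` are taken from your snippet and assumed to be defined elsewhere, and I'm assuming json4s for `parse`/`extract` (which also needs an implicit `Formats` that the original code was missing):

```scala
import org.json4s._
import org.json4s.jackson.JsonMethods._
import org.apache.spark.streaming.kafka.KafkaUtils

// Needed by json4s' extract[String]; the original snippet omitted it.
implicit val formats: Formats = DefaultFormats

def checkforType(json: String): String =
  (parse(json) \\ "Head" \\ "Type").extract[String]

val dstream = KafkaUtils.createStream(ssc, zkQuorum, group, Map("topic" -> 1)).map(_._2)

// Branch by filtering the same stream instead of calling dstream.map
// inside another DStream's transformation. Each filter/map pair is set
// up once on the driver; only the plain functions are serialized.
val t300 = dstream.filter(s => checkforType(s) == "P300").map(par300)
val t30  = dstream.filter(s => checkforType(s) == "P30").map(par30)
val t6   = dstream.filter(s => checkforType(s) == "P6").map(par6)

t300.print()
t30.print()
t6.print()
```

Note that this parses each record once per branch; if that matters, you could map to a `(type, payload)` pair first and filter on the key instead.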