
Task Not Serializable exception when using IgniteRDD

What is wrong with this code? I cannot get rid of the "Task not serializable" exception.

@throws(classOf[Exception])
override def setUp(cfg: BenchmarkConfiguration) {
  super.setUp(cfg)
  sc = new SparkContext("local[4]", "BenchmarkTest")
  sqlContext = new HiveContext(sc)
  ic = new IgniteContext[RddKey, RddVal](sc,
    () ⇒ configuration("client", client = true))
  icCache = ic.fromCache(PARTITIONED_CACHE_NAME)
  icCache.savePairs(sc.parallelize({
    (0 until 1000).map { n => (n.toLong, s"Value for key $n") }
  }, 10)) // Error happens here: this is "line 89"
  println(icCache.collect)
}

Here is the stack trace:

<20:47:45><yardstick> Failed to start benchmark server (will stop and exit).
org.apache.spark.SparkException: Task not serializable
    at org.apache.spark.util.ClosureCleaner$.ensureSerializable(ClosureCleaner.scala:166)
    at org.apache.spark.util.ClosureCleaner$.clean(ClosureCleaner.scala:158)
    at org.apache.spark.SparkContext.clean(SparkContext.scala:1623)
    at org.apache.spark.rdd.RDD.foreachPartition(RDD.scala:805)
    at org.apache.ignite.spark.IgniteRDD.savePairs(IgniteRDD.scala:170)
    at org.yardstickframework.spark.SparkAbstractBenchmark.setUp(SparkAbstractBenchmark.scala:89)
    at org.yardstickframework.spark.SparkCoreRDDBenchmark.setUp(SparkCoreRDDBenchmark.scala:18)
    at org.yardstickframework.spark.SparkCoreRDDBenchmark$.main(SparkCoreRDDBenchmark.scala:72)
    at org.yardstickframework.spark.SparkNode.start(SparkNode.scala:28)
    at org.yardstickframework.BenchmarkServerStartUp.main(BenchmarkServerStartUp.java:61)
Caused by: java.lang.reflect.InvocationTargetException
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:606)
    at org.apache.spark.serializer.SerializationDebugger$ObjectStreamClassMethods$.getObjFieldValues$extension(SerializationDebugger.scala:240)

It looks like your code was compiled against a different Scala version than the one used to build the Ignite or Spark modules. I ran into a similar exception when my code was compiled for Scala 2.10 while Spark was running on Scala 2.11, and vice versa. The cause could be the module com.databricks:spark-csv_2.10:1.1.0.
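
If the mismatch comes from your own build, a common fix is to pin a single Scala binary version for the whole project and make sure every Spark/Ignite dependency carries the matching _2.10 or _2.11 suffix. Below is a minimal sbt sketch of that idea; the exact artifact coordinates and version numbers (spark-core 1.5.x, ignite-spark_2.10, and so on) are assumptions and should be replaced with whatever your cluster actually runs.

// build.sbt -- a sketch only; versions and artifact names are assumptions
scalaVersion := "2.10.6"

libraryDependencies ++= Seq(
  // %% appends the Scala binary suffix, so these resolve to *_2.10 artifacts
  // and stay consistent with scalaVersion above.
  "org.apache.spark" %% "spark-core" % "1.5.2" % "provided",
  "org.apache.spark" %% "spark-hive" % "1.5.2" % "provided",
  // Ignite's Spark integration is published per Scala version; pick the
  // variant that matches scalaVersion (artifact name assumed here).
  "org.apache.ignite" % "ignite-spark_2.10" % "1.5.0.final",
  // Pulling spark-csv_2.10 into a 2.11 build (or vice versa) reproduces
  // exactly the kind of mismatch described in the answer above.
  "com.databricks" % "spark-csv_2.10" % "1.1.0"
)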
