
Scala code throws an exception in Spark

I am new to Scala and Spark. Today I tried to write some code and run it on Spark, but got an exception.

This code works in local Scala:

import org.apache.commons.lang.time.StopWatch
import org.apache.spark.{SparkConf, SparkContext}

import scala.collection.mutable.ListBuffer
import scala.util.Random

  def test(): List[Int] = {
    val size = 100
    val range = 100
    var listBuffer = new ListBuffer[Int] // the exception points here
    val random = new Random()
    for (i <- 1 to size)
      listBuffer += random.nextInt(range)
    listBuffer.foreach(x => println(x))
    listBuffer.toList
  }

But when I put this code into Spark, it throws an exception:

15/01/01 14:06:17 INFO SparkDeploySchedulerBackend: SchedulerBackend is ready for scheduling beginning after reached minRegisteredResourcesRatio: 0.0
Exception in thread "main" java.lang.NoSuchMethodError: scala.runtime.ObjectRef.create(Ljava/lang/Object;)Lscala/runtime/ObjectRef;
    at com.tudou.sortedspark.Sort$.test(Sort.scala:35)
    at com.tudou.sortedspark.Sort$.sort(Sort.scala:23)
    at com.tudou.sortedspark.Sort$.main(Sort.scala:14)
    at com.tudou.sortedspark.Sort.main(Sort.scala)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:483)
    at org.apache.spark.deploy.SparkSubmit$.launch(SparkSubmit.scala:358)
    at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:75)
    at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)

If I comment out the line below, the code works in Spark:

for (i <- 1 to size)

Can someone please explain why?

Thanks @Imm, I have solved this issue. The root cause is that my local Scala is 2.11.4, but my Spark cluster is running version 1.2.0, and Spark 1.2 was compiled with Scala 2.10.
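For what it's worth, the reason the for loop in particular triggers the error is most likely closure capture of the local var: when a closure captures a local var of a reference type, scalac 2.11 wraps it in scala.runtime.ObjectRef via the ObjectRef.create factory, and that factory does not exist in the Scala 2.10 library that Spark 1.2 ships with. That is also why the stack trace points at the line where the var is declared. A minimal sketch of the pattern (the capturedVar name is just for illustration, not from the original code):

def capturedVar(): List[Int] = {
  var acc = List.empty[Int]                 // local var of a reference type
  (1 to 3).foreach { i => acc = i :: acc }  // closure captures and mutates acc
  // For this pattern scalac 2.11 emits scala.runtime.ObjectRef.create(...),
  // a factory method missing from the 2.10 scala-library on the cluster,
  // hence the NoSuchMethodError when the 2.11-compiled class runs there.
  acc.reverse
}

Rewriting test() without the var (for example List.fill(size)(random.nextInt(range))) would also avoid the capture, but the proper fix is matching the Scala versions, as below.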

So the solution is to compile the local code with Scala 2.10 and upload the compiled jar to Spark. Everything works fine.
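If the project is built with sbt (an assumption; the build tool is not mentioned in the post), pinning the Scala version and the Spark dependency looks roughly like this; the "provided" scope is the usual convention for spark-submit, and the exact 2.10.x patch version may differ on your cluster:

// build.sbt (sketch)
name := "sortedspark"

scalaVersion := "2.10.4"  // must match the Scala version Spark 1.2.0 was built with

// %% appends the Scala binary version, so this resolves to spark-core_2.10
libraryDependencies += "org.apache.spark" %% "spark-core" % "1.2.0" % "provided"

After sbt package, submit the resulting jar to the cluster with spark-submit as before.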
