Scala code works on spark-shell but not in spark-submit

The following is the main Scala code:

import org.apache.spark.{SparkConf, SparkContext}

1. val conf = new SparkConf()
2. conf.setMaster("spark://master:7077")
3. conf.setAppName("Commnity Detective")
4. val sc = new SparkContext(conf)
5. val rdd = sc.textFile("hdfs://master:9000/lvcy/test/ungraph/test.txt")
6. val maprdd = rdd.map(line => { val p = line.split("\\s+"); (p(0), p(1)) }) union rdd.map(line => { val p = line.split("\\s+"); (p(1), p(0)) })
7. val reducerdd = maprdd.reduceByKey((a, b) => a + "\t" + b)
8. val reduceArray = reducerdd.collect()
9. val reducemap = reduceArray.toMap

Problem statement:

  1. Copying the code (lines 5-9) into spark-shell and running it there gives the correct result.
  2. If I put the code into Eclipse, build a jar package, and then submit the job with "spark-submit", I get the error below ("Main.scala:21" corresponds to line 9 above, i.e. the toMap call is what fails. WHY?):

     Exception in thread "main" java.lang.NoSuchMethodError: scala.Predef$.$conforms()Lscala/Predef$$less$colon$less;
         at net.lvcy.main.Main$.main(Main.scala:21)
         at net.lvcy.main.Main.main(Main.scala)
         at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
         at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
         at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
         at java.lang.reflect.Method.invoke(Method.java:497)
         at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:665)
         at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:170)
         at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:193)
         at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:112)
         at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)

It looks like a Scala version mismatch. You should make sure that the Scala version used to build your jar is the same as the Scala version of your Spark cluster binaries, e.g. 2.10.
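For context on why the failure points at toMap: turning an Array of pairs into a Map needs an implicit proof that the element type really is a pair, and a jar compiled with Scala 2.11 materializes that proof by calling scala.Predef$.$conforms(), a method that does not exist in the Scala 2.10 runtime library on the cluster. The sketch below only illustrates that mechanism (the desugared call shown in the comments is an assumption about what the 2.11 compiler emits, not something taken from your build):

// Minimal sketch of the call that fails. toMap's signature is roughly
//   def toMap[K, V](implicit ev: A <:< (K, V)): Map[K, V]
// and the implicit evidence comes from Predef ($conforms in Scala 2.11, conforms in 2.10).
object ToMapSketch {
  def main(args: Array[String]): Unit = {
    val reduceArray: Array[(String, String)] = Array(("a", "b"), ("c", "d"))
    // Compiled with Scala 2.11 this is roughly: reduceArray.toMap(Predef.$conforms)
    // Run against a 2.10 scala-library, that method is missing -> NoSuchMethodError.
    val reducemap: Map[String, String] = reduceArray.toMap
    println(reducemap)
  }
}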

Prebuilt Spark distributions are compiled with Scala 2.10, so make sure your Spark cluster is running on Scala 2.10.
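If you build with sbt, pinning the Scala version in build.sbt keeps the jar in step with a 2.10-based cluster. A minimal sketch (the project name and the Spark version 1.5.2 are assumptions, use whatever your cluster actually runs; marking spark-core as "provided" is just the usual choice for spark-submit jobs):

name := "community-detective"   // assumed project name

scalaVersion := "2.10.6"        // must match the Scala major version of the cluster binaries

// Spark version here is an assumed example; %% resolves to spark-core_2.10 once scalaVersion is 2.10.x
libraryDependencies += "org.apache.spark" %% "spark-core" % "1.5.2" % "provided"

The same idea applies when building from Eclipse/Maven: compile against the Scala 2.10 library and the _2.10 Spark artifacts rather than the 2.11 ones.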
