
Error while submitting a Spark Scala job using spark-submit

I have written a simple app in Scala using Eclipse -> New Scala Project.

I am using Scala 2.10.6 and Spark 2.0.2. The app compiles without errors, and I have also exported the jar file.

I am using the following command to execute the JAR:

spark-submit  TowerTest.jar --class com.IFTL.EDI.LocateTower MobLocationData Output1

The Scala code snippet is as follows:

package com.IFTL.EDI

import scala.math.pow
import org.apache.spark.SparkContext
import org.apache.spark.SparkContext._
import org.apache.spark.SparkConf

object LocateTower {
  def main(args: Array[String]) {

    // create Spark context with Spark configuration
    val sc = new SparkContext(new SparkConf().setAppName("TowerCount"))

    // helper to add two locations; used when finding the tower centroid
    def addLocations(p1: (Double, Double), p2: (Double, Double)) = {
      (p1._1 + p2._1, p1._2 + p2._2)
    }
  }
}
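
For context, a helper like this is typically paired with reduceByKey to average the coordinates per tower. Below is a minimal sketch of such a usage; the (towerId, (lat, lon)) pairing and the sample values are my assumptions rather than part of the original code, and it would live inside main, where sc and addLocations are in scope.

// Hypothetical sketch: average (lat, lon) per tower ID using addLocations.
// Assumes the input has already been parsed into (towerId, (lat, lon)) pairs.
val locations = sc.parallelize(Seq(
  ("towerA", (12.97, 77.59)),
  ("towerA", (12.99, 77.61))))

val centroids = locations
  .mapValues(loc => (loc, 1))             // attach a count to each point
  .reduceByKey { case ((p1, n1), (p2, n2)) =>
    (addLocations(p1, p2), n1 + n2)       // sum coordinates and counts
  }
  .mapValues { case ((latSum, lonSum), n) => (latSum / n, lonSum / n) }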

The snippet above is not the full code. When I run it, I get the following error:

[cloudera@quickstart ~]$ spark-submit --class com.IFTL.EDI.LocateTower TowerTest.jar MobLocationData LocationOut1
 java.lang.ClassNotFoundException: com.IFTL.EDI.LocateTower
 at java.net.URLClassLoader$1.run(URLClassLoader.java:366)
 at java.net.URLClassLoader$1.run(URLClassLoader.java:355)
 at java.security.AccessController.doPrivileged(Native Method)
 at java.net.URLClassLoader.findClass(URLClassLoader.java:354)
 at java.lang.ClassLoader.loadClass(ClassLoader.java:425)
 at java.lang.ClassLoader.loadClass(ClassLoader.java:358)
 at java.lang.Class.forName0(Native Method)
 at java.lang.Class.forName(Class.java:270)
 at org.apache.spark.util.Utils$.classForName(Utils.scala:176)
 at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:689)
 at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:181)
 at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:206)
 at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:121)
 at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)

I am new to Spark and Scala, so I am not sure what I'm missing.

Try it in this order; the --class option must come before the jar file. Make sure you specify the path to the jar, or run spark-submit from the jar's location. If you want to pass some parameters, you can now put them after the jar file:

spark-submit --class com.IFTL.EDI.LocateTower /Users/myJarFilePath/TowerTest.jar

Try it like that first; once you have it working, you can add the command-line arguments.
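
For example, with the input and output arguments from the question (the jar path here is an assumption):

spark-submit --class com.IFTL.EDI.LocateTower /Users/myJarFilePath/TowerTest.jar MobLocationData LocationOut1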
