
Spark job fails with "java.lang.NoClassDefFoundError: org/apache/spark/sql/SQLContext" in Spark Job Server

I created a Spark job with IntelliJ, and I want it to be loaded and run by Spark Job Server. For this I followed the steps in this link: http://github.com/ooyala/spark-jobserver. My Spark version is 1.4.0.

This is the Scala code in my project:

import org.apache.spark.sql.hive.HiveContext
import org.apache.spark.SparkContext

// Spark Job Server API
import com.typesafe.config.{Config, ConfigFactory}
import scala.util.Try
import spark.jobserver.{SparkJob, SparkJobInvalid, SparkJobValid, SparkJobValidation}

class hiveSparkRest extends SparkJob {
  var idCard: String = ""

  // Entry point for running the job standalone, outside the job server.
  def main(args: Array[String]): Unit = {
    val sc = new SparkContext("local[4]", "SmartApp")
    val config = ConfigFactory.parseString("")

    val results = runJob(sc, config)
    println("Result is " + results)
  }

  // Called by the job server before runJob to check the job configuration.
  override def validate(sc: SparkContext, config: Config): SparkJobValidation = {
    Try(config.getString("input.string"))
      .map(x => SparkJobValid)
      .getOrElse(SparkJobInvalid("No input.string config param"))
  }

  // Called by the job server with the shared SparkContext.
  override def runJob(sc: SparkContext, config: Config): Any = {
    idCard = config.getString("input.string")
    enterTimesMax(sc)
  }

  // Builds its own HiveContext, so it only needs the SparkContext.
  def enterTimesMax(sc: SparkContext): Unit = {
    val hiveContext = new HiveContext(sc)
    hiveContext.sql("use default")

    val sqlUrl = "select max(num) from (select idcard, count(1) as num from passenger group by idcard) as t"

    val idCardArray = hiveContext.sql(sqlUrl).collect()
  }
}

But when I execute it, I get curl: (52) Empty reply from server, with this error in the Spark Job Server log:

job-server[ERROR] Uncaught error from thread [JobServer-akka.actor.default-dispatcher-12] shutting down JVM since 'akka.jvm-exit-on-fatal-error' is enabled for ActorSystem[JobServer]
job-server[ERROR] java.lang.NoClassDefFoundError: org/apache/spark/sql/SQLContext
job-server[ERROR]   at java.lang.ClassLoader.defineClass1(Native Method)
job-server[ERROR]   at java.lang.ClassLoader.defineClassCond(ClassLoader.java:631)
job-server[ERROR]   at java.lang.ClassLoader.defineClass(ClassLoader.java:615)
job-server[ERROR]   at java.security.SecureClassLoader.defineClass(SecureClassLoader.java:141)
job-server[ERROR]   at java.net.URLClassLoader.defineClass(URLClassLoader.java:283)
job-server[ERROR]   at java.net.URLClassLoader.access$000(URLClassLoader.java:58)
job-server[ERROR]   at java.net.URLClassLoader$1.run(URLClassLoader.java:197)
job-server[ERROR]   at java.security.AccessController.doPrivileged(Native Method)
job-server[ERROR]   at java.net.URLClassLoader.findClass(URLClassLoader.java:190)
job-server[ERROR]   at java.lang.ClassLoader.loadClass(ClassLoader.java:306)
job-server[ERROR]   at java.lang.ClassLoader.loadClass(ClassLoader.java:247)
job-server[ERROR]   at sql.hiveSparkRest.shadePassenger(hiveSparkRest.scala:62)
job-server[ERROR]   at sql.hiveSparkRest.runJob(hiveSparkRest.scala:56)
job-server[ERROR]   at spark.jobserver.JobManagerActor$$anonfun$spark$jobserver$JobManagerActor$$getJobFuture$4.apply(JobManagerActor.scala:222)
job-server[ERROR]   at scala.concurrent.impl.Future$PromiseCompletingRunnable.liftedTree1$1(Future.scala:24)
job-server[ERROR]   at scala.concurrent.impl.Future$PromiseCompletingRunnable.run(Future.scala:24)
job-server[ERROR]   at akka.dispatch.TaskInvocation.run(AbstractDispatcher.scala:42)
job-server[ERROR]   at akka.dispatch.ForkJoinExecutorConfigurator$AkkaForkJoinTask.exec(AbstractDispatcher.scala:386)
job-server[ERROR]   at scala.concurrent.forkjoin.ForkJoinTask.doExec(ForkJoinTask.java:260)
job-server[ERROR]   at scala.concurrent.forkjoin.ForkJoinPool$WorkQueue.runTask(ForkJoinPool.java:1339)
job-server[ERROR]   at scala.concurrent.forkjoin.ForkJoinPool.runWorker(ForkJoinPool.java:1979)
job-server[ERROR]   at scala.concurrent.forkjoin.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:107)
job-server[ERROR] Caused by: java.lang.ClassNotFoundException: org.apache.spark.sql.SQLContext
job-server[ERROR]   at java.net.URLClassLoader$1.run(URLClassLoader.java:202)
job-server[ERROR]   at java.security.AccessController.doPrivileged(Native Method)
job-server[ERROR]   at java.net.URLClassLoader.findClass(URLClassLoader.java:190)
job-server[ERROR]   at java.lang.ClassLoader.loadClass(ClassLoader.java:306)
job-server[ERROR]   at java.lang.ClassLoader.loadClass(ClassLoader.java:247)
job-server[ERROR]   ... 22 more
job-server ... finished with exit code 255

It seems the HiveContext class is provided by the Spark assembly jar spark-assembly-1.4.0-hadoop1.0.4.jar. The NoClassDefFoundError for org.apache.spark.sql.SQLContext suggests that the spark-sql classes are not on the job server's runtime classpath at the moment my job class is loaded.
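For reference, here is a minimal build.sbt sketch for such a job (the version numbers and the job-server-api coordinates are assumptions; adjust them to your setup). Marking the Spark modules as "provided" keeps them out of the job jar, which means the job server itself must supply them at runtime, e.g. via a Spark assembly built with Hive support:

// build.sbt -- a minimal sketch, assuming sbt and Spark 1.4.0 on Scala 2.10.
// "provided" dependencies are compiled against but not bundled into the
// job jar; the job server's own classpath has to contain them.
name := "smart-app"

scalaVersion := "2.10.5"

libraryDependencies ++= Seq(
  "org.apache.spark" %% "spark-core"     % "1.4.0" % "provided",
  "org.apache.spark" %% "spark-sql"      % "1.4.0" % "provided",
  "org.apache.spark" %% "spark-hive"     % "1.4.0" % "provided",
  // Hypothetical coordinates for the job server API; check the version
  // of spark-jobserver you actually run against.
  "spark.jobserver"  %% "job-server-api" % "0.5.2" % "provided"
)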

I don't think the ooyala repo is the main one any more. In the maintained repo, the link below shows a test job that uses HiveContext. For the SparkHiveJob trait you need the job-server-extras jar.

https://github.com/spark-jobserver/spark-jobserver/blob/master/job-server-extras/src/spark.jobserver/HiveTestJob.scala
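As a rough sketch, your job rewritten against that trait could look like this (the method signatures below follow HiveTestJob in the linked repo, but treat them as assumptions and check them against the job-server-extras version you use). The job server then constructs and shares the HiveContext for you, so the job class never touches SQLContext directly:

import com.typesafe.config.Config
import org.apache.spark.sql.hive.HiveContext
import scala.util.Try
import spark.jobserver.{SparkHiveJob, SparkJobInvalid, SparkJobValid, SparkJobValidation}

object EnterTimesMaxJob extends SparkHiveJob {
  // The job server passes in an already-initialized HiveContext.
  def validate(hive: HiveContext, config: Config): SparkJobValidation = {
    Try(config.getString("input.string"))
      .map(_ => SparkJobValid)
      .getOrElse(SparkJobInvalid("No input.string config param"))
  }

  def runJob(hive: HiveContext, config: Config): Any = {
    hive.sql("use default")
    val sqlUrl = "select max(num) from (select idcard, count(1) as num from passenger group by idcard) as t"
    hive.sql(sqlUrl).collect()
  }
}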
