
java.lang.NoSuchMethodError: org.apache.spark.sql.hive.HiveContext.sql(Ljava/lang/String;)Lorg/apache/spark/sql/DataFrame

I'm getting the error below while running my Spark program with spark-submit.

My Spark cluster is version 2.0.0. I use sbt to compile my code, and these are my sbt dependencies:

libraryDependencies ++= Seq(
  "commons-io" % "commons-io" % "2.4",
  "com.google.guava" % "guava" % "19.0",
  "jfree" % "jfreechart" % "1.0.13",
  ("org.deeplearning4j" % "deeplearning4j-core" % "0.5.0").exclude("org.slf4j", "slf4j-log4j12"),
  "org.jblas" % "jblas" % "1.2.4",
  "org.nd4j" % "canova-nd4j-codec" % "0.0.0.15",
  "org.nd4j" % "nd4j-native" % "0.5.0" classifier "" classifier "linux-x86_64",
  "org.deeplearning4j" % "dl4j-spark" % "0.4-rc3.6" ,
  "org.apache.spark" % "spark-sql_2.10" % "1.3.1", 
  "org.apache.spark" % "spark-hive_2.10" % "1.3.1",
  "org.apache.hive" % "hive-serde" % "0.14.0", 
  ("org.deeplearning4j" % "arbiter-deeplearning4j" % "0.5.0"))



16/11/14 22:57:03 INFO hive.HiveSharedState: Warehouse path is 'file:/home/hduser/spark-warehouse'.
Exception in thread "main" java.lang.NoSuchMethodError: org.apache.spark.sql.hive.HiveContext.sql(Ljava/lang/String;)Lorg/apache/spark/sql/DataFrame;
    at poc.common.utilities.StockData$.fetchStockData(StockData.scala:15)
    at poc.analaticsEngine.AnalaticsStockWorkBench.fetchTrainingDataSet(AnalaticsStockWorkBench.scala:69)
    at poc.analaticsEngine.AnalaticsStockWorkBench.trainModel(AnalaticsStockWorkBench.scala:79)
    at test.poc.analatics.StockPrediction$.testTrainSaveModel(StockPrediction.scala:21)
    at test.poc.analatics.StockPrediction$.main(StockPrediction.scala:10)
    at test.poc.analatics.StockPrediction.main(StockPrediction.scala)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:497)
    at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:729)
    at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:185)
    at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:210)
    at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:124)
    at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
16/11/14 22:57:03 INFO spark.SparkContext: Invoking stop() from shutdown hook

First of all, you say that you use Spark 2.0.0, but your dependencies include

"org.apache.spark" % "spark-sql_2.10" % "1.3.1", 
"org.apache.spark" % "spark-hive_2.10" % "1.3.1",

You need to change those dependencies to version 2.0.0 to keep them consistent with your Spark cluster. What's more, you don't need to specify the spark-sql dependency separately, because it's already pulled in transitively by spark-hive. hive-serde is now available in version 2.1.0, so 0.14.0 is probably obsolete.
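A minimal sketch of what the corrected Spark entries could look like, assuming the cluster runs Spark 2.0.0 built for Scala 2.11 (the `%%` operator lets sbt select the matching Scala artifact, so the `_2.10` suffix is dropped):

```scala
libraryDependencies ++= Seq(
  // Match the cluster version exactly; "provided" keeps Spark out of the fat jar
  // since spark-submit supplies it at runtime
  "org.apache.spark" %% "spark-hive" % "2.0.0" % "provided"
  // No separate spark-sql entry: spark-hive depends on it transitively
)
```
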

Please follow the dl4j examples for versioning. I'm not sure where or how you got canova in there (it hasn't been used for nearly six months now):
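The NoSuchMethodError itself is a binary-incompatibility symptom: the code was compiled against Spark 1.3.1, where HiveContext.sql returned the DataFrame class, but on a 2.0.0 cluster sql returns Dataset[Row] (DataFrame is only a type alias there), so the old method signature no longer exists at runtime. Once the dependencies are on 2.0.0, the Spark 2.x entry point is SparkSession with Hive support rather than HiveContext. A hedged sketch of what the fetch code might look like after migrating (the table name stock_data is a placeholder, not from the original post):

```scala
import org.apache.spark.sql.{DataFrame, SparkSession}

object StockDataExample {
  def main(args: Array[String]): Unit = {
    // SparkSession is the 2.x entry point, replacing SQLContext/HiveContext
    val spark = SparkSession.builder()
      .appName("StockPrediction")
      .enableHiveSupport() // requires spark-hive on the classpath
      .getOrCreate()

    // In Spark 2.x, sql() returns Dataset[Row]; DataFrame is a type alias for it
    val df: DataFrame = spark.sql("SELECT * FROM stock_data")
    df.show()

    spark.stop()
  }
}
```
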

https://github.com/deeplearning4j/dl4j-examples/blob/master/dl4j-examples/pom.xml

 