
Apache Spark: java.lang.NoSuchMethodError .rddToPairRDDFunctions

sbt package runs just fine, but after spark-submit I get the error:

Exception in thread "main" java.lang.NoSuchMethodError: org.apache.spark.SparkContext$.rddToPairRDDFunctions(Lorg/apache/spark/rdd/RDD;Lscala/reflect/ClassTag;Lscala/reflect/ClassTag;Lscala/math/Ordering;)Lorg/apache/spark/rdd/PairRDDFunctions;
    at SmokeStack$.main(SmokeStack.scala:46)
    at SmokeStack.main(SmokeStack.scala)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:736)
    at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:185)
    at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:210)
    at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:124)
    at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)

Here is the offending line:

val sigCounts = rowData.map(row => (row("Signature"), 1)).countByKey()

rowData is an RDD of Map[String, String]. The "Signature" key exists in every item in the map.
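
For context, here is a self-contained sketch of the pattern in question; the object name and the sample data are illustrative, not taken from the original code:

import org.apache.spark.{SparkConf, SparkContext}
import org.apache.spark.SparkContext._

object SmokeStackSketch {
  def main(args: Array[String]): Unit = {
    val sc = new SparkContext(new SparkConf().setAppName("Example1").setMaster("local[*]"))

    // rowData: RDD[Map[String, String]], where every map contains a "Signature" key
    val rowData = sc.parallelize(Seq(
      Map("Signature" -> "A", "Other" -> "x"),
      Map("Signature" -> "B", "Other" -> "y"),
      Map("Signature" -> "A", "Other" -> "z")
    ))

    // countByKey relies on the implicit RDD-to-PairRDDFunctions conversion,
    // which is exactly what the NoSuchMethodError above points at
    val sigCounts = rowData.map(row => (row("Signature"), 1)).countByKey()
    sigCounts.foreach(println)

    sc.stop()
  }
}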

I suspect this may be a build issue. Below is my sbt file:

name := "Example1"
version := "0.1"
scalaVersion := "2.11.8"

libraryDependencies += "org.apache.spark" %% "spark-core" % "1.2.0"
scalacOptions ++= Seq("-feature")

I'm new to Scala, so maybe the imports are not correct? I have:

import org.apache.spark.SparkContext
import org.apache.spark.SparkContext._
import scala.io.Source

java.lang.NoSuchMethodError is often an indication that the code was compiled against a different version of a library than the one available at runtime: the method existed at compile time but is missing from the classpath when the program runs.

With Spark, that means the Spark version used to compile the application is different from the one deployed on the machine or cluster.

Aligning the versions between development and runtime should solve this issue.
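
For example, a minimal build.sbt sketch, assuming the cluster runs Spark 2.2.0 on Scala 2.11 (the version numbers are illustrative; use whatever spark-submit --version reports on the machine or cluster that runs the job):

name := "Example1"

version := "0.1"

scalaVersion := "2.11.8"

// "provided" keeps the Spark jars out of the packaged artifact; spark-submit supplies them at runtime
libraryDependencies += "org.apache.spark" %% "spark-core" % "2.2.0" % "provided"

Recompiling against the same Spark version that actually runs the job lets the implicit conversion behind countByKey resolve to a method that exists at runtime.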

I was facing the same problem while reading a simple one-line JSON file into a DataFrame and showing it with the .show() method. I would get this error on the myDF.show() line of code.

For me it turned out to be the wrong version of the spark-sql library in the build.

i.e., the spark-sql version that SBT had pulled into my External Libraries was not the one my code needed.

Adding the following line to my build.sbt resolved the issue:

libraryDependencies += "org.apache.spark" %% "spark-sql" % "2.2.0"

If you update the version of one Spark dependency, the safest approach is to update all of them to the same version.
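
A minimal sketch of that in build.sbt, assuming Spark 2.2.0 (the version and the particular modules listed are illustrative): declare the version once and reuse it for every Spark artifact.

val sparkVersion = "2.2.0"

libraryDependencies ++= Seq(
  "org.apache.spark" %% "spark-core" % sparkVersion % "provided",
  "org.apache.spark" %% "spark-sql"  % sparkVersion % "provided"
)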
