Can't run spark-shell! java.lang.NoSuchMethodError: org.apache.spark.repl.SparkILoop.mumly
hadoop@youngv-VirtualBox:/usr/local/spark$ ./bin/spark-shell
18/11/30 23:32:38 WARN Utils: Your hostname, youngv-VirtualBox resolves to a loopback address: 127.0.0.1; using 10.0.2.15 instead (on interface enp0s3)
18/11/30 23:32:38 WARN Utils: Set SPARK_LOCAL_IP if you need to bind to another address
18/11/30 23:32:40 WARN NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
Setting default log level to "WARN".
To adjust logging level use sc.setLogLevel(newLevel). For SparkR, use setLogLevel(newLevel).
Exception in thread "main" java.lang.NoSuchMethodError: org.apache.spark.repl.SparkILoop.mumly(Lscala/Function0;)Ljava/lang/Object;
	at org.apache.spark.repl.SparkILoop$$anonfun$process$1.org$apache$spark$repl$SparkILoop$$anonfun$$loopPostInit$1(SparkILoop.scala:199)
	at org.apache.spark.repl.SparkILoop$$anonfun$process$1$$anonfun$startup$1$1.apply(SparkILoop.scala:267)
	at org.apache.spark.repl.SparkILoop$$anonfun$process$1$$anonfun$startup$1$1.apply(SparkILoop.scala:247)
	at org.apache.spark.repl.SparkILoop$$anonfun$process$1.withSuppressedSettings$1(SparkILoop.scala:235)
	at org.apache.spark.repl.SparkILoop$$anonfun$process$1.startup$1(SparkILoop.scala:247)
	at org.apache.spark.repl.SparkILoop$$anonfun$process$1.apply$mcZ$sp(SparkILoop.scala:282)
	at org.apache.spark.repl.SparkILoop.runClosure(SparkILoop.scala:159)
	at org.apache.spark.repl.SparkILoop.process(SparkILoop.scala:182)
	at org.apache.spark.repl.Main$.doMain(Main.scala:78)
	at org.apache.spark.repl.Main$.main(Main.scala:58)
	at org.apache.spark.repl.Main.main(Main.scala)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
I get this error when I try to run spark-shell. My environment: spark-2.4.0, scala-2.11.12, jdk-1.8. Can anyone tell me how to solve this problem? I would be very grateful.
There may be conflicting jar versions on the assembly classpath. Remove the duplicates and try building again.
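One way to spot such a conflict is to list the REPL-related jars shipped with the Spark distribution and look for more than one version of the same artifact. The snippet below is a minimal sketch: it builds a throwaway demo directory (`/tmp/spark-jars-demo` is a made-up path) containing the kind of mixed Scala versions that trigger this `NoSuchMethodError`; against a real install you would point the `ls` at your own `$SPARK_HOME/jars` instead.

```shell
# Demo directory standing in for $SPARK_HOME/jars (path is hypothetical).
mkdir -p /tmp/spark-jars-demo
# Two different scala-library versions side by side: the classic cause of
# binary-incompatibility errors like NoSuchMethodError at REPL startup.
touch /tmp/spark-jars-demo/scala-library-2.11.12.jar \
      /tmp/spark-jars-demo/scala-library-2.12.7.jar \
      /tmp/spark-jars-demo/spark-repl_2.11-2.4.0.jar
# List the Scala/REPL jars; seeing two scala-library versions signals trouble.
ls /tmp/spark-jars-demo | grep -E 'scala-(library|compiler|reflect)|spark-repl' | sort
```

If the listing shows two `scala-library` versions (as the demo does), remove the one that does not match the Scala version your Spark build targets (Spark 2.4.0 prebuilt binaries target Scala 2.11 by default).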