Spark - Error: Could not find or load main class org.apache.spark.launcher.Main
I just started playing with Spark and I'm already struggling. I downloaded Spark's spark-1.6.1-bin-hadoop2.4
and tried to open the PySpark shell with ./bin/pyspark,
but I was unfortunately prompted with the following:
Error: Could not find or load main class org.apache.spark.launcher.Main
Environment:
Any clues on how to troubleshoot this?
It works fine with Spark 1.2.0 pre-built for Hadoop 2.4 and later.
Please check whether Java is installed on the machine, and check that all the jar files under the lib directory are intact.
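A minimal sketch of those checks from the shell, assuming the spark-1.6.1-bin-hadoop2.4 distribution was extracted to the current directory (in the 1.x prebuilt layout, org.apache.spark.launcher.Main ships inside the assembly jar under lib/):

    # verify that Java is installed and on the PATH
    java -version

    # from the Spark home directory, confirm the assembly jar is present and readable
    cd spark-1.6.1-bin-hadoop2.4
    ls -l lib/spark-assembly-*.jar

    # quick integrity check: confirm the launcher class is inside the jar
    jar tf lib/spark-assembly-*.jar | grep org/apache/spark/launcher/Main

If the jar is missing, unreadable, or corrupted (the grep returns nothing), re-downloading and re-extracting the Spark tarball is a reasonable next step.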