
Spark-submit looking in wrong directory

I have just installed Anaconda, Apache Spark, PySpark, and Scala on a fresh Linux Mint install (all latest versions).

To test the install I have tried running spark-submit in a terminal, but I get the following error:

File "/home/jessica/anaconda/bin/find_spark_home.py", line 74, in <module>
    print(_find_spark_home())
  File "/home/jessica/anaconda/bin/find_spark_home.py", line 56, in _find_spark_home
    module_home = os.path.dirname(find_spec("pyspark").origin)
AttributeError: 'NoneType' object has no attribute 'origin'
/home/jessica/anaconda/bin/spark-submit: line 27: /bin/spark-class: No such file or directory

I see that the command is looking in /bin/ instead of in the (correct) /usr/local/spark/bin.

My $PATH variable contains the following: /usr/local/spark/bin:/home/jessica/anaconda/bin:/usr/bin:/bin:/usr/sbin:/sbin:/usr/local/bin::/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/usr/games:/usr/local/games

I also have an env variable called $SPARK_HOME that contains /usr/local/spark/.

How can I tell my system to look in the right directory instead?
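For diagnosis, it may help to reproduce what find_spark_home.py is doing: when it cannot resolve SPARK_HOME from the script's own location, it falls back to locating the installed pyspark package via importlib, and if that lookup fails the `'NoneType' object has no attribute 'origin'` error appears. A minimal sketch of that check (the logic mirrors the traceback above; it is not the actual Spark script):

```python
# Sketch of the fallback logic in find_spark_home.py: locate the
# pyspark package with importlib and derive a home directory from it.
import os
from importlib.util import find_spec

print("SPARK_HOME from environment:", os.environ.get("SPARK_HOME"))

spec = find_spec("pyspark")
if spec is None:
    # This is the state that triggers:
    # AttributeError: 'NoneType' object has no attribute 'origin'
    print("pyspark is not importable from this Python interpreter")
else:
    print("pyspark package found at:", os.path.dirname(spec.origin))
```

If this prints that pyspark is not importable, the Anaconda Python running /home/jessica/anaconda/bin/spark-submit cannot see a pyspark installation, which is consistent with the wrapper then falling back to the bogus /bin/spark-class path.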

To fix this error I had to manually set the JAVA_HOME variable in /etc/environment.
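A minimal sketch of the relevant /etc/environment entries — the JDK path below is only an example and must match the actual Java install on the machine (it can be found with `readlink -f "$(which java)"`):

```shell
# /etc/environment (sketch -- paths are examples, adjust to your system)
JAVA_HOME="/usr/lib/jvm/java-8-openjdk-amd64"
SPARK_HOME="/usr/local/spark"
```

/etc/environment is read at login rather than sourced by the shell, so a logout/login (or reboot) is needed before the new variables take effect.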
