Can't find spark-submit when typing spark-shell
I've been following this tutorial to install Spark for Scala: https://www.tutorialspoint.com/apache_spark/apache_spark_installation.htm
However, when I try to run spark-shell, I receive this error in my console:
/usr/local/spark/bin/spark-shell: line 57: /usr/local/spark/bin/bin/spark-submit: No such file or directory
My bashrc looks like this:
export PATH = $PATH:/usr/local/spark/bin
export SCALA_HOME=/usr/local/scala/bin
export PYTHONPATH=$SPARK_HOME/python
So what am I getting wrong? I've installed Spark for Python before, but now I'm trying to use Scala. Is Spark confusing the variables?
Thanks.
You have one bin too many in the path it's searching:
/usr/local/spark/bin/bin/spark-submit
should be
/usr/local/spark/bin/spark-submit
In your case, SPARK_HOME should be /usr/local/spark/, not /usr/local/spark/bin/ as it seems to be now.
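Concretely, a minimal sketch of the corrected exports, assuming Spark is installed under /usr/local/spark as in the question:

```shell
# SPARK_HOME is the install root, not its bin/ subdirectory:
export SPARK_HOME=/usr/local/spark
# Append bin/ once, via SPARK_HOME, so the lookup never becomes .../bin/bin:
export PATH="$PATH:$SPARK_HOME/bin"
```

After editing ~/.bashrc, run `source ~/.bashrc` (or open a new shell) so the change takes effect.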
Based on @Wilmerton's answer, I came up with the following working configuration inside my ~/.bashrc:
# Apache Spark stuff
export JAVA_HOME=/usr/lib/jvm/default-java/jre
export SPARK_HOME=/usr/lib/spark
export SCALA_HOME=/usr/local/scala/bin
export PATH=$PATH:${SPARK_HOME}/bin
export PATH=$PATH:$SCALA_HOME
(I installed default-jdk with apt-get install default-jdk, and aptitude search jdk yields the following entries with a status different from p:
i default-jdk - Standard Java or Java compatible Development Kit
i A default-jdk-headless - Standard Java or Java compatible Development Kit (headless)
i A openjdk-8-jdk - OpenJDK Development Kit (JDK)
i A openjdk-8-jdk-headless - OpenJDK Development Kit (JDK) (headless)
iBA openjdk-8-jre - OpenJDK Java runtime, using Hotspot JIT
i A openjdk-8-jre-headless - OpenJDK Java runtime, using Hotspot JIT (headless)
i openjdk-9-jdk-headless - OpenJDK Development Kit (JDK) (headless)
iB openjdk-9-jre - OpenJDK Java runtime, using Hotspot JIT
i A openjdk-9-jre-headless - OpenJDK Java runtime, using Hotspot JIT (headless)
)
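With a configuration like the one above in place, a quick sanity check can confirm that the launcher scripts resolve (a sketch; it assumes ~/.bashrc has already been re-sourced in the current shell):

```shell
# Check whether the Spark launcher scripts now resolve on PATH:
spark_submit_path=$(command -v spark-submit || true)
if [ -n "$spark_submit_path" ]; then
  echo "spark-submit found at: $spark_submit_path"
else
  echo "spark-submit not on PATH; re-check SPARK_HOME in ~/.bashrc"
fi
```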
Try changing the second environment variable in your bashrc to this:
export PATH=$PATH:/usr/local/scala/bin
My .bash_profile is pretty much the same as yours, except I use PATH instead of SCALA_HOME.
I got the same problem; here is how I solved it:
# Install Java 8 and download Spark 3.2.1 (Colab-style shell cells):
!sudo apt update
!apt-get install openjdk-8-jdk-headless -qq > /dev/null
!wget -q https://dlcdn.apache.org/spark/spark-3.2.1/spark-3.2.1-bin-hadoop3.2.tgz
!tar xf spark-3.2.1-bin-hadoop3.2.tgz
!pip install -q findspark
!pip install pyspark
import os
# Tell pyspark where Java and Spark live:
os.environ["JAVA_HOME"] = "/usr/lib/jvm/java-8-openjdk-amd64"
os.environ["SPARK_HOME"] = "/content/spark-3.2.1-bin-hadoop3.2"
# findspark adds pyspark to sys.path based on SPARK_HOME:
import findspark
findspark.init()
findspark.find()
from pyspark.sql import DataFrame, SparkSession
from typing import List
import pyspark.sql.types as T
import pyspark.sql.functions as F
spark = SparkSession \
    .builder \
    .appName("Our First Spark example") \
    .getOrCreate()
spark
When importing the library and creating a Spark session with from pyspark.sql import SparkSession, the command below fails:
spark = SparkSession.builder.getOrCreate()
The error that appears:
FileNotFoundError: [Errno 2] No such file or directory: '/path/to/spark/./bin/spark-submit': '/path/to/spark/./bin/spark-submit'
Can anyone tell me what is going on?
Thanks,
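The traceback suggests that SPARK_HOME is still set to the literal placeholder /path/to/spark copied from some documentation, so pyspark cannot shell out to spark-submit. A hedged sketch of the fix, where /opt/spark is a hypothetical install location to be replaced with the real one:

```shell
# Point SPARK_HOME at the actual Spark installation (adjust the path):
export SPARK_HOME=/opt/spark
export PATH="$PATH:$SPARK_HOME/bin"
# pyspark launches "$SPARK_HOME/bin/spark-submit" under the hood,
# so that file must exist and be executable:
if [ -x "$SPARK_HOME/bin/spark-submit" ]; then
  echo "spark-submit found"
else
  echo "spark-submit missing; unpack Spark there or adjust SPARK_HOME"
fi
```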