spark-submit: command not found

A very simple question:

I'm trying to use a bash script to submit Spark jobs, but somehow it keeps complaining that it cannot find the spark-submit command. Yet when I copy the command out and run it directly in my terminal, it runs fine.

My shell is fish, and here's what I have in my fish config file, ~/.config/fish/config.fish:

alias spark-submit='/Users/MY_NAME/Downloads/spark-2.0.2-bin-hadoop2.7/bin/spark-submit'

Here's my bash script:

#!/usr/bin/env bash


SUBMIT_COMMAND="HADOOP_USER_NAME=hdfs spark-submit \
      --master $MASTER \
      --deploy-mode client \
      --driver-memory $DRIVER_MEMORY \
      --executor-memory $EXECUTOR_MEMORY \
      --num-executors $NUM_EXECUTORS \
      --executor-cores $EXECUTOR_CORES \
      --conf spark.shuffle.compress=true \
      --conf spark.network.timeout=2000s \
      $DEBUG_PARAM \
      --class com.fisher.coder.OfflineIndexer \
      --verbose \
      $JAR_PATH \
      --local $LOCAL \
      $SOLR_HOME \
      --solrconfig 'resource:solrhome/' \
      $ZK_QUORUM_PARAM \
      --source $SOURCE \
      --limit $LIMIT \
      --sample $SAMPLE \
      --dest $DEST \
      --copysolrconfig \
      --shards $SHARDS \
      $S3_ZK_ZNODE_PARENT \
      $S3_HBASE_ROOTDIR \
      "

eval "$SUBMIT_COMMAND"

What I've tried: I can run this command perfectly fine in my Mac OS X fish shell when I copy it out literally and run it directly. However, what I want to achieve is to be able to run ./submit.sh -local, which executes the above script.

Any clues please?

You seem to be confused about what a fish alias is. When you run this:

alias spark-submit='/Users/MY_NAME/Downloads/spark-2.0.2-bin-hadoop2.7/bin/spark-submit'

You are actually doing this:

function spark-submit
    /Users/MY_NAME/Downloads/spark-2.0.2-bin-hadoop2.7/bin/spark-submit $argv
end

That is, you are defining a fish function. Your bash script has no knowledge of that function. You need to either put that path in your $PATH variable or put a similar alias command in your bash script.
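
For example, either of these near the top of submit.sh would fix it (a minimal sketch using the Spark path from the question; note that non-interactive bash scripts do not expand aliases by default, so a PATH entry or a plain variable is more reliable than an alias here):

 # Option 1: put the Spark bin directory on the script's own PATH
 export PATH="$PATH:/Users/MY_NAME/Downloads/spark-2.0.2-bin-hadoop2.7/bin"

 # Option 2: call the binary through a variable instead,
 # i.e. use "$SPARK_SUBMIT" in place of spark-submit in SUBMIT_COMMAND
 SPARK_SUBMIT=/Users/MY_NAME/Downloads/spark-2.0.2-bin-hadoop2.7/bin/spark-submit

Alternatively, because fish exports PATH to child processes, appending the directory in ~/.config/fish/config.fish also makes it visible to the script:

 set -gx PATH $PATH /Users/MY_NAME/Downloads/spark-2.0.2-bin-hadoop2.7/bin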

Make sure the Spark bin directory is added to your PATH:

 export PATH=$PATH:/Users/{your_own_path_where_spark_installed}/bin

On a Mac, open one of these files, ~/.bash_profile, ~/.zprofile, or ~/.zshrc, and add the command above to the file.
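
After editing, reload the config and check that the shell can now find the command, for example:

 source ~/.zshrc
 which spark-submit
 # should print the full path to spark-submit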
