Spark-shell with 'yarn-client' tries to load config from wrong location
I'm trying to launch bin/spark-shell
and bin/pyspark
from my laptop, connecting to a YARN cluster in yarn-client
mode, and I get the same error:
WARN ScriptBasedMapping: Exception running
/etc/hadoop/conf.cloudera.yarn1/topology.py 10.0.240.71
java.io.IOException: Cannot run program "/etc/hadoop/conf.cloudera.yarn1/topology.py"
(in directory "/Users/eugenezhulenev/projects/cloudera/spark"): error=2,
No such file or directory
Spark is trying to run /etc/hadoop/conf.cloudera.yarn1/topology.py
on my laptop, not on a worker node in YARN.
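The warning comes from Hadoop's ScriptBasedMapping, which resolves rack topology by running the script named by the `net.topology.script.file.name` property in core-site.xml. A quick way to see which script your client-side config points at is to grep for that property; the sample core-site.xml below is fabricated for illustration, and on a real machine you would point `CONF` at your actual HADOOP_CONF_DIR instead:

```shell
# Fabricated demo config dir; replace with your real HADOOP_CONF_DIR.
CONF=demo-conf
mkdir -p "$CONF"
cat > "$CONF/core-site.xml" <<'EOF'
<configuration>
  <property>
    <name>net.topology.script.file.name</name>
    <value>/etc/hadoop/conf.cloudera.yarn1/topology.py</value>
  </property>
</configuration>
EOF

# Show which topology script the config names.
grep -A1 'net.topology.script.file.name' "$CONF/core-site.xml"
```

If the `<value>` names a script that does not exist on the machine where you launch spark-shell, ScriptBasedMapping logs exactly the IOException shown above.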
This problem appeared after updating from Spark 1.2.0 to 1.3.0 (CDH 5.4.2).
The following steps are a temporary work-around for this issue on CDH 5.4.4:
cd ~
mkdir -p test-spark/
cd test-spark/
Then copy all files from /etc/hadoop/conf.cloudera.yarn1 on one of the worker nodes into the (local) directory above, and run
spark-shell
from ~/test-spark/
The problem is related to infrastructure where the Hadoop conf files are not copied to all nodes alongside the Spark conf files. Some nodes may be missing those files, and if you launch from a node where they are missing, you will hit this problem.
When Spark starts, it looks for the conf files:
1. first in the location HADOOP_CONF_DIR points at;
2. if that location is missing, in the directory from which Spark was started.
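That fall-back behaviour can be sketched as a small shell check; the demo directory and file below are fabricated, and in practice HADOOP_CONF_DIR would already be set by your environment:

```shell
# Fabricated demo dir standing in for a real Hadoop conf location.
HADOOP_CONF_DIR=demo-hadoop-conf
mkdir -p "$HADOOP_CONF_DIR"
echo '<configuration/>' > "$HADOOP_CONF_DIR/core-site.xml"

# Mirror the lookup order: prefer HADOOP_CONF_DIR, otherwise fall back
# to the directory spark-shell was started from.
if [ -f "$HADOOP_CONF_DIR/core-site.xml" ]; then
  echo "using conf from HADOOP_CONF_DIR: $HADOOP_CONF_DIR"
else
  echo "falling back to launch directory: $(pwd)"
fi
```

Running the same check on the affected node tells you quickly which of the two locations Spark is actually picking up.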
To solve this problem, look for the missing folder on the other nodes; if it is available there, copy it to the node where you see the problem. Otherwise, you can copy the Hadoop conf folder to the same location under the yarn conf name to solve this problem.
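The fall-back fix can be sketched as a copy of one conf dir to the other. The directory names below are local stand-ins for illustration; on the real node `SRC` would be something like /etc/hadoop/conf and `DST` the missing /etc/hadoop/conf.cloudera.yarn1 (both assumptions about a CDH-style layout, and the real paths would likely need sudo):

```shell
# Local stand-ins; substitute the real /etc/hadoop paths on the node.
SRC=old-conf
DST=new-conf
mkdir -p "$SRC" "$DST"
echo '<configuration/>' > "$SRC/core-site.xml"   # stand-in conf file

# Copy the contents of the existing conf dir into the missing one.
cp -r "$SRC"/. "$DST"/
ls "$DST"
```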