Mahout spark-shell: Initial job has not accepted any resources
I'm just getting started with Mahout and Spark, and I'm trying to run the example from Mahout's page at this link:
Playing with Mahout's Spark Shell
Everything appears to start, but when I try to run the following command, it returns the error below:
val y = drmData.collect(::, 4)
[Stage 0:> (0 + 0) / 2] 15/09/26 18:38:09 WARN TaskSchedulerImpl: Initial job has not accepted any resources; check your cluster UI to ensure that workers are registered and have sufficient resources
Can anyone help me with this?
My environment is:
export JAVA_HOME=/usr/lib/jvm/java-1.7.0-openjdk-amd64
export MAHOUT_HOME=/home/celso/Downloads/mahout/mahout
export SPARK_HOME=/home/celso/Downloads/spark-1.4.1
export MASTER=spark://celso-VirtualBox:7077
I also tried setting MAHOUT_LOCAL to true.
The Mahout 0.11.x Spark Shell is not yet compatible with Spark 1.4.1.
The most recent release, Mahout 0.11.0, requires Spark 1.3.x.
Mahout 0.10.2 is compatible with Spark 1.2.x and earlier.
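The compatibility pairs above can be captured in a small shell helper. This is only a sketch: the function name `required_spark` is mine, not part of Mahout's tooling, and the version pairs come straight from this answer.

```shell
#!/bin/sh
# Map a Mahout release to the Spark line it expects, per the notes above.
# required_spark is a hypothetical helper for quick reference only.
required_spark() {
  case "$1" in
    0.11.*) echo "1.3.x" ;;              # Mahout 0.11.x builds against Spark 1.3.x
    0.10.2) echo "1.2.x or earlier" ;;   # Mahout 0.10.2 works up to Spark 1.2.x
    *)      echo "unknown" ;;
  esac
}

required_spark 0.11.0   # prints 1.3.x
```

So with Spark 1.4.1 in SPARK_HOME, neither released Mahout shell will match, which is consistent with the questioner's setup failing.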
I just got the example to work. I simply set the environment variable MASTER to local:
export MASTER=local
instead of
export MASTER=spark://hadoopvm:7077
The example worked!
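For reference, the fix described above amounts to the following (a sketch; the relaunch line is commented out since it needs a real Mahout install, and the old master URL is the one from this answer):

```shell
#!/bin/sh
# Switch the Mahout shell from the standalone master to Spark local mode,
# so tasks run in-process instead of waiting on cluster workers.
export MASTER=local                      # was: spark://hadoopvm:7077
echo "MASTER is now $MASTER"
# "$MAHOUT_HOME/bin/mahout" spark-shell  # relaunch the shell with the new master
```

Running in local mode sidesteps the "Initial job has not accepted any resources" warning, which typically means the standalone master has no registered workers with enough free cores or memory to schedule the job.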