
Cannot start Spark on my cluster

The information in the log file spark-hadoop-org.apache.spark.deploy.master.Master-1-master.out is as below:

[screenshot of the master log output, not included]

And the log file spark-hadoop-org.apache.spark.deploy.worker.Worker-1-master.out says:

[screenshot of the worker log output, not included]

Help please. My Spark version is spark-1.6.0-bin-without-hadoop.tgz, my Scala version is 2.10.5, and my Hadoop version is 2.6.0.

You need to add slf4j.api-1.6.1.jar under the $SPARK_HOME/lib directory on each Spark worker node as well as on the Spark master node. Once that is done, restart the master and worker processes.
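For example, a minimal sketch of those steps, assuming the jar has already been downloaded to /tmp on the master, that there is a single worker host named worker1 (hypothetical hostname), that the hadoop user has passwordless SSH to it, and that SPARK_HOME points to the same path on every node:

    # on the master node: place the jar into Spark's lib directory
    cp /tmp/slf4j.api-1.6.1.jar $SPARK_HOME/lib/

    # copy the same jar to each worker node
    # (assumes SPARK_HOME resolves to the same path on the worker)
    scp /tmp/slf4j.api-1.6.1.jar hadoop@worker1:$SPARK_HOME/lib/

    # restart the standalone master and workers from the master node
    $SPARK_HOME/sbin/stop-all.sh
    $SPARK_HOME/sbin/start-all.sh

If you manage the daemons individually instead of with the stop-all.sh/start-all.sh helpers, restart the master and each worker with the corresponding scripts in $SPARK_HOME/sbin on each node.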
