
Apache Shark installation on Spark cluster

When running Shark on a Spark cluster with one node, I'm getting the following error. Can anyone please solve it?
Thanks in advance.

Error:
Executor updated: app-20140619165031-0000/0 is now FAILED (class java.io.IOException: Cannot run program "/home/trendwise/Hadoop_tools/jdk1.7.0_40/bin/java" (in directory "/home/trendwise/Hadoop_tools/spark/spark-0.9.1-bin-hadoop1/work/app-20140619165031-0000/0"): error=2, No such file or directory)  

In my experience, "No such file or directory" is often a symptom of some other underlying exception: usually "no space left on device", and sometimes "too many open files". Mine the logs for other stack traces, and monitor your disk usage and inode usage to confirm.
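The disk and inode checks suggested above can be done with standard POSIX tools. A minimal sketch (the mount point and the worker PID are placeholders, not taken from the question):

```shell
# Show free disk space per filesystem -- a full disk under the Spark
# work directory can surface as "No such file or directory" failures
df -h

# Show inode usage -- a filesystem can report free space yet still
# fail with "no space left on device" when its inodes are exhausted
df -i

# Count files held open by a process, to spot "too many open files";
# replace <worker-pid> with the Spark worker's actual PID
# lsof -p <worker-pid> | wc -l
```

If `df -i` shows inode usage near 100%, clearing out accumulated small files (for example, old application directories under the Spark `work/` folder) is a common fix.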
