
Can't find start-all.sh in Hadoop installation

I'm trying to set up Hadoop on my local machine and am following this procedure. I've also set up Hadoop at home.

This is the command I'm trying to run now:

hduser@ubuntu:~$ /usr/local/hadoop/bin/start-all.sh

This is the error I get:

-su: /usr/local/hadoop/bin/start-all.sh: No such file or directory

This is what I added to my $HOME/.bashrc file:

# Set Hadoop-related environment variables
export HADOOP_HOME=/usr/local/hadoop

# Set JAVA_HOME (we will also configure JAVA_HOME directly for Hadoop later on)
export JAVA_HOME=/usr/lib/jvm/java-8-oracle

# Some convenient aliases and functions for running Hadoop-related commands
unalias fs &> /dev/null
alias fs="hadoop fs"
unalias hls &> /dev/null
alias hls="fs -ls"

# If you have LZO compression enabled in your Hadoop cluster and
# compress job outputs with LZOP (not covered in this tutorial):
# Conveniently inspect an LZOP compressed file from the command
# line; run via:
#
# $ lzohead /hdfs/path/to/lzop/compressed/file.lzo
#
# Requires installed 'lzop' command.
#
lzohead () {
    hadoop fs -cat "$1" | lzop -dc | head -1000 | less
}

# Add Hadoop bin/ directory to PATH
export PATH=$PATH:$HADOOP_HOME/bin

EDIT: After trying the solution given by mahendra, I get the following output:

This script is Deprecated. Instead use start-dfs.sh and start-yarn.sh
Starting namenodes on [localhost]
localhost: starting namenode, logging to /usr/local/hadoop/logs/hadoop-hduser-namenode-mmt-HP-ProBook-430-G3.out
localhost: starting datanode, logging to /usr/local/hadoop/logs/hadoop-hduser-datanode-mmt-HP-ProBook-430-G3.out
Starting secondary namenodes [0.0.0.0]
0.0.0.0: starting secondarynamenode, logging to /usr/local/hadoop/logs/hadoop-hduser-secondarynamenode-mmt-HP-ProBook-430-G3.out
starting yarn daemons
starting resourcemanager, logging to /usr/local/hadoop/logs/yarn-hduser-resourcemanager-mmt-HP-ProBook-430-G3.out
localhost: starting nodemanager, logging to /usr/local/hadoop/logs/yarn-hduser-nodemanager-mmt-HP-ProBook-430-G3.out

Try running:

hduser@ubuntu:~$ /usr/local/hadoop/sbin/start-all.sh

because start-all.sh and stop-all.sh are located in the sbin directory, while the hadoop binary is located in the bin directory.
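The bin-vs-sbin split described above can be checked mechanically. A minimal sketch (the `hadoop_script` helper and the temporary directory layout are illustrative stand-ins, not part of Hadoop itself):

```shell
# Resolve a Hadoop control script, preferring the Hadoop 2.x layout
# where daemon scripts (start-all.sh, stop-all.sh, ...) live in sbin/
# and user-facing binaries (hadoop, hdfs, ...) live in bin/.
hadoop_script() {
    prefix=$1
    name=$2
    if [ -e "$prefix/sbin/$name" ]; then
        echo "$prefix/sbin/$name"
    else
        echo "$prefix/bin/$name"
    fi
}

# Stand-in for a real install such as /usr/local/hadoop:
prefix=$(mktemp -d)
mkdir -p "$prefix/bin" "$prefix/sbin"
touch "$prefix/sbin/start-all.sh" "$prefix/bin/hadoop"

hadoop_script "$prefix" start-all.sh   # resolves to the sbin/ copy
hadoop_script "$prefix" hadoop         # resolves to the bin/ copy
```

On a real install you would pass /usr/local/hadoop as the prefix.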

Also update your .bashrc:

export PATH=$PATH:$HADOOP_HOME/bin:$HADOOP_HOME/sbin

so that you can access start-all.sh directly.
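To confirm the PATH change took effect, check that the shell now resolves start-all.sh from sbin. A self-contained sketch (a temp directory merely stands in for the /usr/local/hadoop install from the question):

```shell
# Simulate the .bashrc change: HADOOP_HOME would normally be
# /usr/local/hadoop; a temp dir with the same layout stands in here.
HADOOP_HOME=$(mktemp -d)
mkdir -p "$HADOOP_HOME/bin" "$HADOOP_HOME/sbin"
printf '#!/bin/sh\n' > "$HADOOP_HOME/sbin/start-all.sh"
chmod +x "$HADOOP_HOME/sbin/start-all.sh"

# The line added to .bashrc:
export PATH=$PATH:$HADOOP_HOME/bin:$HADOOP_HOME/sbin

command -v start-all.sh   # prints the sbin path once PATH includes it
```

After editing the real .bashrc, run `source ~/.bashrc` (or open a new shell) before retrying the command.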

