
can't find start-all.sh in hadoop installation

I am trying to set up Hadoop on my local machine, following this guide; I have also set up Hadoop at home.

Here is the command I am trying to run now:

hduser@ubuntu:~$ /usr/local/hadoop/bin/start-all.sh

This is the error I get:

-su: /usr/local/hadoop/bin/start-all.sh: No such file or directory

This is what I added to my $HOME/.bashrc file:

# Set Hadoop-related environment variables
export HADOOP_HOME=/usr/local/hadoop

# Set JAVA_HOME (we will also configure JAVA_HOME directly for Hadoop later on)
export JAVA_HOME=/usr/lib/jvm/java-8-oracle

# Some convenient aliases and functions for running Hadoop-related commands
unalias fs &> /dev/null
alias fs="hadoop fs"
unalias hls &> /dev/null
alias hls="fs -ls"

# If you have LZO compression enabled in your Hadoop cluster and
# compress job outputs with LZOP (not covered in this tutorial):
# Conveniently inspect an LZOP compressed file from the command
# line; run via:
#
# $ lzohead /hdfs/path/to/lzop/compressed/file.lzo
#
# Requires installed 'lzop' command.
#
lzohead () {
    hadoop fs -cat "$1" | lzop -dc | head -1000 | less
}

# Add Hadoop bin/ directory to PATH
export PATH=$PATH:$HADOOP_HOME/bin

EDIT: After trying the solution given by mahendra, I get the following output:

This script is Deprecated. Instead use start-dfs.sh and start-yarn.sh
Starting namenodes on [localhost]
localhost: starting namenode, logging to /usr/local/hadoop/logs/hadoop-hduser-namenode-mmt-HP-ProBook-430-G3.out
localhost: starting datanode, logging to /usr/local/hadoop/logs/hadoop-hduser-datanode-mmt-HP-ProBook-430-G3.out
Starting secondary namenodes [0.0.0.0]
0.0.0.0: starting secondarynamenode, logging to /usr/local/hadoop/logs/hadoop-hduser-secondarynamenode-mmt-HP-ProBook-430-G3.out
starting yarn daemons
starting resourcemanager, logging to /usr/local/hadoop/logs/yarn-hduser-resourcemanager-mmt-HP-ProBook-430-G3.out
localhost: starting nodemanager, logging to /usr/local/hadoop/logs/yarn-hduser-nodemanager-mmt-HP-ProBook-430-G3.out
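As the log itself notes, `start-all.sh` is deprecated in Hadoop 2.x. A sketch of starting the two daemon groups separately instead (paths assume the `/usr/local/hadoop` install prefix used throughout this question; this requires a configured Hadoop installation, so it cannot be run outside such an environment):

```shell
# Start the HDFS daemons (namenode, datanode, secondary namenode)
/usr/local/hadoop/sbin/start-dfs.sh

# Start the YARN daemons (resourcemanager, nodemanager)
/usr/local/hadoop/sbin/start-yarn.sh

# jps (from the JDK) should then list the running daemon JVMs
jps
```

Using the two scripts separately also makes it easier to see which group of daemons failed to come up.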

Try running:

hduser@ubuntu:~$ /usr/local/hadoop/sbin/start-all.sh

This is because start-all.sh and stop-all.sh are located in the sbin directory, while the hadoop binary is located in the bin directory.

Also update your .bashrc with:

export PATH=$PATH:$HADOOP_HOME/bin:$HADOOP_HOME/sbin

so that you can access start-all.sh directly.
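Put together, the PATH change can be sketched and checked like this (the `/usr/local/hadoop` prefix is the one from the question; in practice these lines live in `.bashrc` and take effect after re-sourcing it):

```shell
# Append both Hadoop script directories to PATH
export HADOOP_HOME=/usr/local/hadoop
export PATH=$PATH:$HADOOP_HOME/bin:$HADOOP_HOME/sbin

# With sbin on PATH, the shell can resolve start-all.sh
# without the full path being typed out.
case ":$PATH:" in
  *":$HADOOP_HOME/sbin:"*) echo "sbin on PATH" ;;
  *) echo "sbin missing" ;;
esac
# prints "sbin on PATH"
```

The `case` check is just a quick way to confirm the directory was appended; `command -v start-all.sh` would also work once the scripts actually exist at that prefix.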

