
Hadoop error: cannot start-all.sh

I set up Hadoop in single-node mode on my laptop. Info: Ubuntu 12.10, Oracle JDK 1.7, Hadoop installed from the .deb package. Locations: /etc/hadoop and /usr/share/hadoop.

In /usr/share/hadoop/templates/conf/core-site.xml I added two properties:

<property>
  <name>hadoop.tmp.dir</name>
  <value>/app/hadoop/tmp</value>
  <description>A base for other temporary directories.</description>
</property>

<property>
  <name>fs.default.name</name>
  <value>hdfs://localhost:9000</value>
  <description>The name of the default file system.  A URI whose
  scheme and authority determine the FileSystem implementation.  The
  uri's scheme determines the config property (fs.SCHEME.impl) naming
  the FileSystem implementation class.  The uri's authority is used to
  determine the host, port, etc. for a filesystem.</description>
</property>
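A sanity check worth doing with this configuration: the hadoop.tmp.dir above must exist and be writable by the user that starts the daemons. A minimal sketch of that setup, rehearsed under a throwaway prefix so it can run without root; the real commands are in the comments, and the hduser:hadoop owner pair is the usual convention, not something stated above:

```shell
# Rehearse creating hadoop.tmp.dir under a scratch prefix.
PREFIX=$(mktemp -d)                 # stand-in for the filesystem root
mkdir -p "$PREFIX/app/hadoop/tmp"   # real: sudo mkdir -p /app/hadoop/tmp
chmod 750 "$PREFIX/app/hadoop/tmp"  # real: sudo chown hduser:hadoop /app/hadoop/tmp
ls -ld "$PREFIX/app/hadoop/tmp"     #       sudo chmod 750 /app/hadoop/tmp
```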

In hdfs-site.xml:

<property>
  <name>dfs.replication</name>
  <value>1</value>
  <description>Default block replication.
  The actual number of replications can be specified when the file is created.
  The default is used if replication is not specified in create time.
  </description>
</property>

In mapred-site.xml:

<property>
  <name>mapred.job.tracker</name>
  <value>localhost:9001</value>
  <description>The host and port that the MapReduce job tracker runs
  at.  If "local", then jobs are run in-process as a single map
  and reduce task.
  </description>
</property>

When I start the daemons with hduser@sepdau:~$ start-all.sh:

starting namenode, logging to /var/log/hadoop/hduser/hadoop-hduser-namenode-sepdau.com.out
localhost: starting datanode, logging to /var/log/hadoop/hduser/hadoop-hduser-datanode-sepdau.com.out
localhost: starting secondarynamenode, logging to /var/log/hadoop/hduser/hadoop-hduser-secondarynamenode-sepdau.com.out
starting jobtracker, logging to /var/log/hadoop/hduser/hadoop-hduser-jobtracker-sepdau.com.out
localhost: starting tasktracker, logging to /var/log/hadoop/hduser/hadoop-hduser-tasktracker-sepdau.com.out

But when I check the processes with jps:

hduser@sepdau:~$ jps
13725 Jps

More:

 root@sepdau:/home/sepdau# netstat -plten | grep java
tcp6       0      0 :::8080                 :::*                    LISTEN      117        9953        1316/java       
tcp6       0      0 :::53976                :::*                    LISTEN      117        16755       1316/java       
tcp6       0      0 127.0.0.1:8700          :::*                    LISTEN      1000       786271      8323/java       
tcp6       0      0 :::59012                :::*                    LISTEN      117        16756       1316/java  

When I run stop-all.sh:

hduser@sepdau:~$ stop-all.sh
no jobtracker to stop
localhost: no tasktracker to stop
no namenode to stop
localhost: no datanode to stop
localhost: no secondarynamenode to stop

In my hosts file:

hduser@sepdau:~$ cat /etc/hosts

127.0.0.1       localhost
127.0.1.1   sepdau.com



# The following lines are desirable for IPv6 capable hosts
::1     ip6-localhost ip6-loopback
fe00::0 ip6-localnet
ff00::0 ip6-mcastprefix
ff02::1 ip6-allnodes
ff02::2 ip6-allrouters

My slaves file contains localhost; my masters file contains localhost.

Here are some logs:

hduser@sepdau:/home/sepdau$ start-all.sh
mkdir: cannot create directory `/var/run/hadoop': Permission denied
starting namenode, logging to /var/log/hadoop/hduser/hadoop-hduser-namenode-sepdau.com.out
/usr/sbin/hadoop-daemon.sh: line 136: /var/run/hadoop/hadoop-hduser-namenode.pid: No such file or directory
localhost: mkdir: cannot create directory `/var/run/hadoop': Permission denied
localhost: starting datanode, logging to /var/log/hadoop/hduser/hadoop-hduser-datanode-sepdau.com.out
localhost: /usr/sbin/hadoop-daemon.sh: line 136: /var/run/hadoop/hadoop-hduser-datanode.pid: No such file or directory
localhost: mkdir: cannot create directory `/var/run/hadoop': Permission denied
localhost: starting secondarynamenode, logging to /var/log/hadoop/hduser/hadoop-hduser-secondarynamenode-sepdau.com.out
localhost: /usr/sbin/hadoop-daemon.sh: line 136: /var/run/hadoop/hadoop-hduser-secondarynamenode.pid: No such file or directory
mkdir: cannot create directory `/var/run/hadoop': Permission denied
starting jobtracker, logging to /var/log/hadoop/hduser/hadoop-hduser-jobtracker-sepdau.com.out
/usr/sbin/hadoop-daemon.sh: line 136: /var/run/hadoop/hadoop-hduser-jobtracker.pid: No such file or directory
localhost: mkdir: cannot create directory `/var/run/hadoop': Permission denied
localhost: starting tasktracker, logging to /var/log/hadoop/hduser/hadoop-hduser-tasktracker-sepdau.com.out
localhost: /usr/sbin/hadoop-daemon.sh: line 136: /var/run/hadoop/hadoop-hduser-tasktracker.pid: No such file or directory

I also tried as the root user, but I get the same problem.

Where am I going wrong? And how can I connect to Eclipse with the Hadoop plugin? Thanks in advance.

Try adding

<property>
  <name>dfs.name.dir</name>
  <value>/home/abhinav/hdfs</value>
</property>

to hdfs-site.xml, and make sure the directory exists.

I have written a small tutorial for this. See if it helps: http://blog.abhinavmathur.net/2013/01/experience-with-setting-multinode.html

You can set the paths where the PID files and logs are created by editing hadoop-env.sh; that file lives in the conf folder.

export HADOOP_LOG_DIR=/home/username/hadoop-1x/logs

export HADOOP_PID_DIR=/home/username/pids
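The two directories named in these exports must exist and be writable before the daemons are restarted; otherwise hadoop-daemon.sh falls back to /var/run/hadoop and fails exactly as in the question's log. A sketch under a scratch prefix (the /home/username paths are the placeholders from the exports above):

```shell
# Create the log and PID directories that hadoop-env.sh will point at,
# and the matching export lines to append to conf/hadoop-env.sh.
PREFIX=$(mktemp -d)   # stand-in for /home/username
mkdir -p "$PREFIX/hadoop-1x/logs" "$PREFIX/pids"
printf 'export HADOOP_LOG_DIR=%s/hadoop-1x/logs\nexport HADOOP_PID_DIR=%s/pids\n' \
    "$PREFIX" "$PREFIX" >> "$PREFIX/hadoop-env.sh"
cat "$PREFIX/hadoop-env.sh"
```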

Modify your hdfs-site.xml:

<property>
  <name>dfs.name.dir</name>
  <value>/home/user_to_run_hadoop/hdfs/name</value>
</property>

<property>
  <name>dfs.data.dir</name>
  <value>/home/user_to_run_hadoop/hdfs/data</value>
</property>

Make sure to create an hdfs directory in /home/user_to_run_hadoop, then create two directories inside hdfs: name and data.

之后你需要chmod -R 755 ./hdfs/path_to_hadoop_home/bin/hadoop namenode -format

Restart the terminal and format the NameNode first.
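The directory steps above can be rehearsed end to end; a sketch under a scratch prefix, with the format step left as a comment because it depends on the local install (path_to_hadoop_home is the placeholder from this answer):

```shell
# Create the dfs.name.dir / dfs.data.dir layout described above.
PREFIX=$(mktemp -d)   # stand-in for /home/user_to_run_hadoop
mkdir -p "$PREFIX/hdfs/name" "$PREFIX/hdfs/data"
chmod -R 755 "$PREFIX/hdfs"
# Then, with the real paths in hdfs-site.xml:
#   path_to_hadoop_home/bin/hadoop namenode -format
ls "$PREFIX/hdfs"
```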

In some rare cases someone has changed the start-all.sh file in Hadoop's bin folder. Check it once.

Also check once whether your .bashrc configuration is correct.
