
NameNode: java.net.BindException


Hi folks, I am stuck in a very strange problem. I am installing HBase and Hadoop on another VM by accessing it from my machine. I have properly installed Hadoop, ran ./start-all.sh, and saw that all processes are running perfectly. So I ran jps and saw:
JobTracker
TaskTracker
NameNode
SecondaryNameNode
DataNode

Everything was running well. But when I set up HBase and then started Hadoop and HBase, I saw that the NameNode was not running, and in the logs (from the NameNode log file) I got this exception:

java.lang.InterruptedException: sleep interrupted
    at java.lang.Thread.sleep(Native Method)
    at org.apache.hadoop.hdfs.server.namenode.DecommissionManager$Monitor.run(DecommissionManager.java:65)
    at java.lang.Thread.run(Thread.java:662)
2012-05-19 08:46:07,493 INFO org.apache.hadoop.hdfs.server.namenode.FSNamesystem: Number of transactions: 0 Total time for transactions(ms): 0Number of transactions batched in Syncs: 0 Number of syncs: 0 SyncTimes(ms): 0 
2012-05-19 08:46:07,516 ERROR org.apache.hadoop.hdfs.server.namenode.NameNode: java.net.BindException: Problem binding to localhost/23.21.195.24:54310 : Cannot assign requested address
    at org.apache.hadoop.ipc.Server.bind(Server.java:227)
    at org.apache.hadoop.ipc.Server$Listener.<init>(Server.java:301)
    at org.apache.hadoop.ipc.Server.<init>(Server.java:1483)
    at org.apache.hadoop.ipc.RPC$Server.<init>(RPC.java:545)
    at org.apache.hadoop.ipc.RPC.getServer(RPC.java:506)
    at org.apache.hadoop.hdfs.server.namenode.NameNode.initialize(NameNode.java:294)
    at org.apache.hadoop.hdfs.server.namenode.NameNode.<init>(NameNode.java:497)
    at org.apache.hadoop.hdfs.server.namenode.NameNode.createNameNode(NameNode.java:1268)
    at org.apache.hadoop.hdfs.server.namenode.NameNode.main(NameNode.java:1277)
Caused by: java.net.BindException: Cannot assign requested address
    at sun.nio.ch.Net.bind(Native Method)
    at sun.nio.ch.ServerSocketChannelImpl.bind(ServerSocketChannelImpl.java:126)
    at sun.nio.ch.ServerSocketAdaptor.bind(ServerSocketAdaptor.java:59)
    at org.apache.hadoop.ipc.Server.bind(Server.java:225)
    ... 8 more

2012-05-19 08:46:07,516 INFO org.apache.hadoop.hdfs.server.namenode.NameNode: SHUTDOWN_MSG: 



I checked the ports and revised all the conf files again and again, but did not find a solution. Please guide me if anyone has an idea.
Thanks a lot.
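The exception says the NameNode tried to bind to `localhost/23.21.195.24:54310`, i.e. the name `localhost` resolved to a public IP that no local interface owns, so the bind fails. A quick way to see how names resolve and whether the port is free (a diagnostic sketch, assuming a typical Linux VM; the port 54310 comes from the log above):

```shell
# How do "localhost" and the machine's own hostname resolve?
getent hosts localhost
getent hosts "$(hostname)"

# Show the /etc/hosts mappings the resolver consults first
cat /etc/hosts

# Is anything already listening on the NameNode RPC port (54310)?
netstat -tln | grep 54310 || echo "port 54310 is free"
```

If `getent hosts localhost` prints anything other than `127.0.0.1` (or `::1`), the hosts file is the likely culprit.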

Based on your comment, your problem is most probably related to the hosts file.

Firstly, you should uncomment the 127.0.0.1 localhost entry; this is a fundamental entry.
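For reference, a minimal /etc/hosts on the VM might look like this (the 192.168.1.50 address and the name `hadoop-vm` are placeholders for your VM's actual address and hostname):

```text
127.0.0.1     localhost
192.168.1.50  hadoop-vm
```

The key point is that `localhost` must map to the loopback address, and the machine's own hostname must map to an address that one of its interfaces actually owns.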

Secondly, have you set up Hadoop and HBase to run with externally accessible services? I'm not too up on HBase, but for Hadoop, the services need to be bound to non-localhost addresses for external access, so your masters and slaves files in $HADOOP_HOME/conf need to name the actual machine names (or IP addresses if you don't have a DNS server). None of your configuration files should mention localhost; they should use either host names or IP addresses.
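Concretely, for the Hadoop 0.20/1.x line in use here, that means `fs.default.name` in conf/core-site.xml should point at the real host name rather than localhost. A sketch, assuming the hypothetical hostname `hadoop-vm` from above:

```xml
<!-- conf/core-site.xml: bind the NameNode RPC service to the VM's real
     host name ("hadoop-vm" is a placeholder), on the port from the log -->
<configuration>
  <property>
    <name>fs.default.name</name>
    <value>hdfs://hadoop-vm:54310</value>
  </property>
</configuration>
```

The same rule applies to any `hbase.rootdir` or similar HDFS URL in the HBase configuration: use the host name, not localhost.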
