
Failed to start NameNode

I've successfully installed Hadoop locally by following these steps: Step by step hadoop installation on windows 10

Java installed version: 1.8.0_231
Hadoop installed version: Hadoop 3.2.1

But after running the command hdfs namenode -format I'm getting the following error:

Re-format filesystem in Storage Directory root= C:\hadoop-3.2.1\data\namenode; location= null ? (Y or N) y
2019-10-19 12:34:58,809 INFO namenode.FSImage: Allocated new BlockPoolId: BP-1445655329-172.18.148.177-1571468698797
2019-10-19 12:34:58,809 INFO common.Storage: Will remove files: []
2019-10-19 12:34:58,812 ERROR namenode.NameNode: Failed to start namenode.
java.lang.UnsupportedOperationException
    at java.nio.file.Files.setPosixFilePermissions(Files.java:2044)
    at org.apache.hadoop.hdfs.server.common.Storage$StorageDirectory.clearDirectory(Storage.java:452)
    at org.apache.hadoop.hdfs.server.namenode.NNStorage.format(NNStorage.java:591)
    at org.apache.hadoop.hdfs.server.namenode.NNStorage.format(NNStorage.java:613)
    at org.apache.hadoop.hdfs.server.namenode.FSImage.format(FSImage.java:188)
    at org.apache.hadoop.hdfs.server.namenode.NameNode.format(NameNode.java:1206)
    at org.apache.hadoop.hdfs.server.namenode.NameNode.createNameNode(NameNode.java:1649)
    at org.apache.hadoop.hdfs.server.namenode.NameNode.main(NameNode.java:1759)
2019-10-19 12:34:58,819 INFO util.ExitUtil: Exiting with status 1: java.lang.UnsupportedOperationException
2019-10-19 12:34:58,823 INFO namenode.NameNode: SHUTDOWN_MSG:
/************************************************************
SHUTDOWN_MSG: Shutting down NameNode at CWT-DST-0051/***.**.***.***

I'm totally new to this, what can be the issue? Thanks in advance.

I resolved the issue by installing Hadoop 2.9.1; there is a known namenode issue in Hadoop 3.2.1: hdfs namenode issue in 3.2.1. On Windows, the 3.2.1 format step ends up calling Files.setPosixFilePermissions on the storage directory (visible in the stack trace above), and the default Windows filesystem does not support POSIX permissions, hence the UnsupportedOperationException.

  1. If you have installed a 32-bit Java version on your Windows machine, you have to set the JAVA_HOME environment variable to C:\Progra~2\Java\<JDK version>, but that does not seem to work.
  • Therefore, try a 64-bit Java version instead and set JAVA_HOME to C:\Progra~1\Java\<JDK version>, for example as sketched below.
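A minimal cmd sketch, assuming a 64-bit JDK is already installed under C:\Progra~1\Java (the 8.3 short name for C:\Program Files, which keeps spaces out of the path); <JDK version> is a placeholder for your actual JDK folder name:

    :: set JAVA_HOME persistently for the current user (takes effect in new cmd windows)
    setx JAVA_HOME "C:\Progra~1\Java\<JDK version>"

    :: open a new cmd window, then verify the value
    echo %JAVA_HOME%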

After setting the 64-bit Java version as JAVA_HOME, you have to run start-all.cmd again in cmd. Then all the other daemons worked, except the namenode. To run the namenode you have to follow these steps (the full sequence is also sketched after the list).

  1. Open cmd as administrator.
  2. Type and run stop-all.cmd
  3. Then run hadoop namenode -format
  4. Finally run start-all.cmd
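Putting the steps together, a rough sketch of the whole sequence in an administrator cmd prompt (jps ships with the JDK and is only used here as a sanity check that the daemons actually came up):

    :: run everything from a cmd prompt opened as administrator
    stop-all.cmd
    hadoop namenode -format
    start-all.cmd

    :: jps lists the running Java processes; after start-all.cmd it should show
    :: daemons such as NameNode, DataNode, ResourceManager and NodeManager
    jps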

Hope it will work for you.

Edit:

Open the hdfs config file with the following command (enter it in cmd or PowerShell):

vim %HADOOP_HOME%\etc\hadoop\hdfs-site.xml

In your config files, you need to use forward slashes and a protocol for the file URI.

For example, change

 C:\BigData\hadoop-2.9.1\data\namenode

to

file:/C:/BigData/hadoop-2.9.1/data/namenode
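Assuming the directory is configured through the standard dfs.namenode.name.dir property, the corresponding entry in hdfs-site.xml would then look roughly like this:

    <property>
      <name>dfs.namenode.name.dir</name>
      <value>file:/C:/BigData/hadoop-2.9.1/data/namenode</value>
    </property>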
