
Delete files in Hadoop file system (HDFS)

I left a sqoop job running and it completely filled HDFS (100% usage). Now I cannot delete files in HDFS; it gives me an exception:

    # hdfs dfs -rm -skipTrash /TEST_FILE 
    rmr: Cannot delete /TEST_FILE. Name node is in safe mode.

I used hdfs dfsadmin -safemode leave to get out of safe mode:

    [hdfs@sandbox /]$ hdfs dfsadmin -safemode leave
    Safe mode is OFF 
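
Note that even when the command prints "Safe mode is OFF", the NameNode will put itself straight back into safe mode if the condition that triggered it (here, full disks) still holds. You can confirm the current state at any time with the "get" subcommand; as a sketch (the guard is only there so the snippet is a harmless no-op on a machine without a Hadoop client):

```shell
# Check whether the NameNode is currently in safe mode.
if command -v hdfs >/dev/null 2>&1; then
    hdfs dfsadmin -safemode get    # prints "Safe mode is ON" or "Safe mode is OFF"
else
    echo "hdfs client not found; run this on a cluster node"
fi
```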

But when I try deleting the file again with hdfs dfs -rm -skipTrash /TEST_FILE, I get the same error message that the NameNode is in safe mode.

I am unable to leave safe mode and delete the file.

Can anyone help me get out of safe mode and delete the file?

The NameNode cannot be brought out of safe mode with the command "hdfs dfsadmin -safemode leave" unless you actually clean something up first; if you run the command without freeing any space, the NameNode will just go back into safe mode. Since you are using a single-node cluster, you can try cleaning up your log directories under /var/log/ ..., then bring the NameNode out of safe mode and delete the HDFS directory left by the sqoop job.
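
The order matters here: free local disk space first, then leave safe mode, then delete. (By default the NameNode re-enters safe mode when the volume holding its metadata drops below the free-space threshold set by dfs.namenode.resource.du.reserved, roughly 100 MB.) A sketch of the steps, assuming the /TEST_FILE path from the question; exact log locations vary by distribution, and the guard makes the snippet a no-op on a machine without a Hadoop client:

```shell
if command -v hdfs >/dev/null 2>&1; then
    # 1. Free local disk space first; log paths differ between distributions.
    df -h /var/log
    # e.g. delete or rotate old files under the Hadoop log directories here

    # 2. With space freed, leaving safe mode should stick; confirm with "get".
    hdfs dfsadmin -safemode leave
    hdfs dfsadmin -safemode get

    # 3. Delete the file the sqoop job left behind, bypassing the trash.
    hdfs dfs -rm -skipTrash /TEST_FILE
else
    echo "hdfs client not found; run these steps on the cluster node"
fi
```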
