
Delete hdfs folder from java

In a Java app running on an edge node, I need to delete an HDFS folder if it exists. I need to do that before running a MapReduce job (with Spark) that outputs into that folder.

I found I could use the method:

org.apache.hadoop.fs.FileUtil.fullyDelete(new File(url))

However, I can only make it work with a local folder (i.e. a file URL on the machine running the app). I tried to use something like:

url = "hdfs://hdfshost:port/the/folder/to/delete";

with hdfs://hdfshost:port being the HDFS namenode IPC address. I use the same address for the MapReduce job, so it is correct. However, it doesn't do anything.

So, what URL should I use, or is there another method?

Note: here is the simple project in question.

This works for me.

Just adding the following code to my WordCount program does it:

import java.net.URI;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.*;
import org.apache.hadoop.mapreduce.Job;

...
Configuration conf = new Configuration();

Path output = new Path("/the/folder/to/delete");
FileSystem hdfs = FileSystem.get(URI.create("hdfs://namenode:port"), conf);

// delete the output directory if it already exists
if (hdfs.exists(output)) {
    hdfs.delete(output, true);   // true = delete recursively
}

Job job = Job.getInstance(conf, "word count");
...

You need to add hdfs://hdfshost:port explicitly to get the distributed file system. Otherwise the code will only work for the local file system.

I do it this way:

    Configuration conf = new Configuration();
    conf.set("fs.hdfs.impl", org.apache.hadoop.hdfs.DistributedFileSystem.class.getName());
    conf.set("fs.file.impl", org.apache.hadoop.fs.LocalFileSystem.class.getName());
    FileSystem hdfs = FileSystem.get(URI.create("hdfs://<namenode-hostname>:<port>"), conf);
    // delete(Path, boolean): the second argument set to true makes the delete recursive
    hdfs.delete(new Path("/path/to/your/file"), true);

You don't need hdfs://hdfshost:port/ in your file path.
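
For example, a minimal sketch assuming the FileSystem was obtained with the hdfs:// URI as in the answer above, and a hypothetical folder /the/folder/to/delete:

    // The Path is resolved against the FileSystem it is passed to,
    // so a plain absolute path is enough once `hdfs` points at the cluster.
    FileSystem hdfs = FileSystem.get(URI.create("hdfs://<namenode-hostname>:<port>"), new Configuration());
    Path folder = new Path("/the/folder/to/delete");   // no hdfs://hdfshost:port prefix needed
    if (hdfs.exists(folder)) {
        hdfs.delete(folder, true);
    }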

If you need to delete all files in the directory:

1) Check how many files there are in your directory.

2) Then delete all of them.

    public void delete_archivos_dedirectorio() throws IOException {

        // namenode = hdfs://ip + ":" + port

        Path directorio = new Path(namenode + "/test/"); // the directory we work in
        // list the files currently in that directory before doing anything
        FileStatus[] fileStatus = hdfsFileSystem.listStatus(directorio);
        // count how many files there are; we then iterate up to that number,
        // deleting each one so they can be recreated later by the write step
        int archivos_basura = fileStatus.length;

        for (int numero = 0; numero < archivos_basura; numero++) {

            Path archivo = new Path(namenode + "/test/" + numero + ".txt");

            try {
                if (hdfsFileSystem.exists(archivo)) {
                    try {
                        hdfsFileSystem.delete(archivo, true);
                    } catch (IOException ex) {
                        System.out.println(ex.getMessage());
                    }
                }
            } catch (IOException e) {
                e.printStackTrace();
            }
        }
    }
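
If the file names are not known in advance, a simpler variant of the same idea (a sketch assuming the same `hdfsFileSystem` and `namenode` fields as above, and a hypothetical /test/ directory) is to delete whatever listStatus returns:

    // Delete every entry returned by listStatus instead of guessing numbered file names.
    Path directory = new Path(namenode + "/test/");
    for (FileStatus status : hdfsFileSystem.listStatus(directory)) {
        hdfsFileSystem.delete(status.getPath(), true);   // true = recursive, in case of subdirectories
    }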

Good luck :)
