
Programmatically reading contents of text file stored in HDFS using Java

How do I run this simple Java program to read bytes from a text file stored in the directory /words in HDFS? Do I need to create a jar file for this?

import java.io.*;
import java.net.MalformedURLException;
import java.net.URL;

public class filesystemhdfs
{
    public static void main(String args[]) throws MalformedURLException, IOException
    {
        // buffer must be allocated; reading into a null array throws a NullPointerException
        byte[] b = new byte[1024];
        InputStream in = new URL("hdfs://localhost/words/file").openStream();
        int n = in.read(b);
        for(int i = 0; i < n; i++)
        {
            System.out.println("b[" + i + "]=" + b[i]);
            System.out.println("" + (char) b[i]);
        }
        in.close();
    }
}

You can use the HDFS API; this can be run from your local machine:

Configuration configuration = new Configuration();
configuration.set("fs.defaultFS", "hdfs://namenode:8020");
FileSystem fs = FileSystem.get(configuration);

Path filePath = new Path("hdfs://namenode:8020/PATH");
FSDataInputStream fsDataInputStream = fs.open(filePath);
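
To actually consume the opened stream, one option (not part of the original answer) is Hadoop's org.apache.hadoop.io.IOUtils helper; the 4096 buffer size and printing to stdout are just illustrative choices:

// Sketch: stream the opened file to stdout with Hadoop's IOUtils.
// The last argument (true) closes fsDataInputStream when the copy finishes.
IOUtils.copyBytes(fsDataInputStream, System.out, 4096, true);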

First, you need to tell the JVM about the hdfs:// scheme for URL objects. This is done via:

URL.setURLStreamHandlerFactory(new FsUrlStreamHandlerFactory());

After compiling your Java class, run it with the hadoop command:

hadoop filesystemhdfs

Hadoop also comes with a convenient IOUtils class; it will simplify a lot of this for you.
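
Putting the stream handler factory and IOUtils together, a minimal sketch of this URL-based approach might look like the following (the class name UrlCat and the path hdfs://localhost/words/file are illustrative placeholders, not part of the original answer):

import java.io.InputStream;
import java.net.URL;

import org.apache.hadoop.fs.FsUrlStreamHandlerFactory;
import org.apache.hadoop.io.IOUtils;

public class UrlCat {
    static {
        // Register the hdfs:// scheme with the JVM; this can only be done once per JVM.
        URL.setURLStreamHandlerFactory(new FsUrlStreamHandlerFactory());
    }

    public static void main(String[] args) throws Exception {
        InputStream in = null;
        try {
            in = new URL("hdfs://localhost/words/file").openStream();
            // Copy the file contents to stdout; 4096 is the buffer size,
            // false means we close the stream ourselves in the finally block.
            IOUtils.copyBytes(in, System.out, 4096, false);
        } finally {
            IOUtils.closeStream(in);
        }
    }
}

Once the class is on the classpath, it can be run the same way as above, e.g. hadoop UrlCat.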

You cannot read a file from HDFS the way Java reads from a regular filesystem. You need to use the HDFS Java API for this.

public static void main(String[] args) {
    UserGroupInformation ugi =
            UserGroupInformation.createRemoteUser("root");

    try {
        ugi.doAs(new PrivilegedExceptionAction<Void>() {

            public Void run() throws Exception {

                Configuration conf = new Configuration();
                // fs.default.name should match the corresponding value
                // in core-site.xml on your Hadoop cluster
                conf.set("fs.default.name", "hdfs://hostname:9000");
                conf.set("hadoop.job.ugi", "root");

                readFile("words/file", conf);

                return null;
            }
        });

    } catch (Exception e) {
        e.printStackTrace();
    }
}

public static void readFile(String file, Configuration conf) throws IOException {
    FileSystem fileSystem = FileSystem.get(conf);

    Path path = new Path(file);
    if (!ifExists(path, conf)) {
        System.out.println("File " + file + " does not exist");
        return;
    }

    FSDataInputStream in = fileSystem.open(path);

    BufferedReader br = new BufferedReader(new InputStreamReader(in));
    String line;
    while ((line = br.readLine()) != null) {
        System.out.println(line);
    }
    br.close();
    in.close();
    fileSystem.close();
}

public static boolean ifExists(Path source, Configuration conf) throws IOException {
    FileSystem hdfs = FileSystem.get(conf);
    boolean isExists = hdfs.exists(source);
    System.out.println(isExists);
    return isExists;
}

Here I am connecting from a remote machine; that is why I use UserGroupInformation and put the code in the run method of PrivilegedExceptionAction. If you are on the local system you may not need it. HTH!

It's a bit late to reply, but this may help future readers. The snippet below iterates over an HDFS directory and reads the content of each file.

Only the Hadoop client and plain Java are used.

Configuration conf = new Configuration();
conf.addResource(new Path("/your/hadoop/conf/core-site.xml"));
conf.addResource(new Path("/your/hadoop/conf/hdfs-site.xml"));
FileSystem fs = FileSystem.get(conf);
FileStatus[] status = fs.listStatus(new Path("hdfs://path/to/your/hdfs/directory"));
for (int i = 0; i < status.length; i++) {
    FSDataInputStream inputStream = fs.open(status[i].getPath());
    // IOUtils.toString here comes from Apache Commons IO (org.apache.commons.io.IOUtils)
    String content = IOUtils.toString(inputStream, "UTF-8");
    System.out.println(content);
    inputStream.close();
}
