Accessing the Hadoop Distributed File System from Eclipse using Java

Here is the code I am using to access HDFS from Java:

    import java.io.BufferedReader;
    import java.io.IOException;
    import java.io.InputStreamReader;

    import org.apache.hadoop.conf.Configuration;
    import org.apache.hadoop.fs.FileSystem;
    import org.apache.hadoop.fs.Path;

    try {
        Configuration config = new Configuration();
        // Add core-site.xml before creating the FileSystem so its settings take effect
        config.addResource(new Path("/usr/local/hadoop/conf/core-site.xml"));
        config.set("fs.defaultFS", "hdfs://192.168.28.153:9000/");
        FileSystem dfs = FileSystem.get(config);

        Path pt = new Path("hdfs://192.168.28.153:9000/user/hduser/wordcountinput/input.txt");
        BufferedReader br = new BufferedReader(new InputStreamReader(dfs.open(pt)));
        String line;
        // Read each line exactly once (the original called readLine() both
        // before and inside the loop body, which skipped every other line)
        while ((line = br.readLine()) != null) {
            System.out.println(line);
        }
        br.close();
    } catch (IOException e) {
        e.printStackTrace();
    }

After executing it, I get the following exception:

 WARNING: org.apache.hadoop.metrics.jvm.EventCounter is deprecated. Please use org.apache.hadoop.log.metrics.EventCounter in all the log4j.properties files.
 java.io.IOException: No FileSystem for scheme: hdfs
     at org.apache.hadoop.fs.FileSystem.getFileSystemClass(FileSystem.java:2138)
     at org.apache.hadoop.fs.FileSystem.createFileSystem(FileSystem.java:2145)
     at org.apache.hadoop.fs.FileSystem.access$200(FileSystem.java:80)

Is this a connection issue? Is some resource missing, or is it something else? I need help on this so that I can proceed further.

Use only the specific jars your code needs. The classpath likely contains a bunch of inappropriate or conflicting jars, which causes this error.
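"No FileSystem for scheme: hdfs" typically means the jar that registers the `hdfs://` filesystem implementation (`org.apache.hadoop.hdfs.DistributedFileSystem`) is not on the classpath, or its service-loader entry was lost when jars were merged. A minimal sketch of the dependency fix, assuming a Maven project and Hadoop 2.x (adjust `hadoop.version` to match your cluster; the version shown is an assumption):

```xml
<properties>
  <!-- Assumed version: must match the Hadoop version on the cluster -->
  <hadoop.version>2.7.7</hadoop.version>
</properties>

<dependencies>
  <!-- Core FileSystem API and Configuration classes -->
  <dependency>
    <groupId>org.apache.hadoop</groupId>
    <artifactId>hadoop-common</artifactId>
    <version>${hadoop.version}</version>
  </dependency>
  <!-- Provides DistributedFileSystem, the handler for the hdfs:// scheme -->
  <dependency>
    <groupId>org.apache.hadoop</groupId>
    <artifactId>hadoop-hdfs</artifactId>
    <version>${hadoop.version}</version>
  </dependency>
</dependencies>
```

Alternatively, you can register the implementation explicitly in code with `config.set("fs.hdfs.impl", org.apache.hadoop.hdfs.DistributedFileSystem.class.getName());`, which sidesteps a lost `META-INF/services` entry, but getting the right jars on the classpath is the cleaner fix.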
