
Hadoop: How to write to HDFS using a Java application

I am new to Hadoop and trying to learn. I am trying to run the Hadoop sample code below in Eclipse on Ubuntu Linux. I have Hadoop 2.7.0 and the required jars.

    import org.apache.hadoop.conf.Configuration;
    import org.apache.hadoop.fs.FileSystem;
    import org.apache.hadoop.fs.Path;

    Configuration conf = new Configuration();
    conf.addResource(new Path("/usr/local/hadoop/conf/core-site.xml"));
    Path pt = new Path("hdfs://localhost:9000/myhome/a.txt");
    FileSystem fs = FileSystem.get(conf);

When I run the application in Eclipse I get Exception in thread "main" java.lang.NoClassDefFoundError: org/apache/hadoop/util/PlatformName. The hadoop-common-2.7.0.jar file I am referencing does not contain the class the application is looking for; I am referencing that jar from the Hadoop common folder.

Any help in resolving this issue will be much appreciated.

If I create a jar file of the class for the above code and run it using hadoop jar <jar file> <class name>, it works. So I am wondering whether it is possible at all to run a Hadoop Java application from Eclipse or the command line without using the hadoop command.

It seems that the JVM's classpath is missing some of the required Hadoop artifacts. hadoop-common alone is not enough here: in Hadoop 2.x, org.apache.hadoop.util.PlatformName is packaged in the hadoop-auth jar, which hadoop-client pulls in transitively.

If you are a Maven user, please ensure that you have these dependencies:

<dependency>
    <groupId>org.apache.hadoop</groupId>
    <artifactId>hadoop-client</artifactId>
    <version>${hadoop.client.version}</version>
</dependency>
<dependency>
    <groupId>org.apache.hadoop</groupId>
    <artifactId>hadoop-common</artifactId>
    <version>${hadoop.client.version}</version>
</dependency>
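
The ${hadoop.client.version} placeholder is a Maven property you define yourself; to match the 2.7.0 install from the question, it would look like this in the pom's <properties> section:

<properties>
    <hadoop.client.version>2.7.0</hadoop.client.version>
</properties>

With those dependencies on the classpath, the program can run straight from Eclipse or with a plain java command; the hadoop jar wrapper is not required. Here is a minimal sketch of a complete write to HDFS, assuming the hdfs://localhost:9000 NameNode address from the question (the class name HdfsWriteExample and the file contents are illustrative):

    import org.apache.hadoop.conf.Configuration;
    import org.apache.hadoop.fs.FSDataOutputStream;
    import org.apache.hadoop.fs.FileSystem;
    import org.apache.hadoop.fs.Path;
    import java.nio.charset.StandardCharsets;

    public class HdfsWriteExample {
        public static void main(String[] args) throws Exception {
            Configuration conf = new Configuration();
            // Point the client at the NameNode instead of relying on core-site.xml.
            conf.set("fs.defaultFS", "hdfs://localhost:9000");

            FileSystem fs = FileSystem.get(conf);
            Path path = new Path("/myhome/a.txt");

            // create() overwrites an existing file by default.
            try (FSDataOutputStream out = fs.create(path)) {
                out.write("hello hdfs".getBytes(StandardCharsets.UTF_8));
            }
            fs.close();
        }
    }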
