
Hadoop: How to write to HDFS using a Java application

I am new to Hadoop and trying to learn. I am trying to run the Hadoop sample code below in Eclipse on Ubuntu Linux. I have Hadoop 2.7.0 and the required jars.

    import org.apache.hadoop.conf.Configuration;
    import org.apache.hadoop.fs.FileSystem;
    import org.apache.hadoop.fs.Path;

    Configuration conf = new Configuration();
    conf.addResource(new Path("/usr/local/hadoop/conf/core-site.xml"));
    Path pt = new Path("hdfs://localhost:9000/myhome/a.txt");
    FileSystem fs = FileSystem.get(conf);
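
Since the goal of the post is writing to HDFS, here is a minimal, self-contained sketch of how the same FileSystem API can be used to create and write a file. The class name HdfsWriteExample and the direct fs.defaultFS setting are assumptions made for illustration; the NameNode URI and target path come from the snippet above.

    import org.apache.hadoop.conf.Configuration;
    import org.apache.hadoop.fs.FSDataOutputStream;
    import org.apache.hadoop.fs.FileSystem;
    import org.apache.hadoop.fs.Path;

    // Minimal sketch, not the original program; HdfsWriteExample is a hypothetical class name.
    public class HdfsWriteExample {
        public static void main(String[] args) throws Exception {
            Configuration conf = new Configuration();
            // Assumption: point the client at the NameNode directly instead of loading core-site.xml.
            conf.set("fs.defaultFS", "hdfs://localhost:9000");

            FileSystem fs = FileSystem.get(conf);
            Path target = new Path("/myhome/a.txt");

            // create(path, true) overwrites any existing file and returns an FSDataOutputStream.
            try (FSDataOutputStream out = fs.create(target, true)) {
                out.writeUTF("written from a plain Java HDFS client");
            }
            fs.close();
        }
    }

FileSystem.get(conf) returns the client for whatever fs.defaultFS points to, and the FSDataOutputStream it hands back behaves like a normal java.io.DataOutputStream.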

When I run the application in Eclipse I get Exception in thread "main" java.lang.NoClassDefFoundError: org/apache/hadoop/util/PlatformName . The hadoop-common-2.7.0.jar file I am referencing does not contain the class the application is looking for. I am referencing that jar file from the Hadoop/common folder.

Any help in resolving this issue will be much appreciated.

If I create a jar file of the class for the above code and run it using hadoop jar <jar file> <class name> , it works. So I am wondering whether it's possible at all to run a Hadoop Java application from Eclipse or the command line without using the hadoop command.
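
For the command-line case, one option is to let the hadoop classpath command supply the Hadoop jars and start the class with plain java. This is only a sketch and assumes the hadoop launcher is on the PATH; HdfsWriteExample is a hypothetical main class standing in for the real one:

    # Sketch: compile and run against the jars reported by `hadoop classpath`.
    javac -cp "$(hadoop classpath)" HdfsWriteExample.java
    java -cp ".:$(hadoop classpath)" HdfsWriteExample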

It seems that the JVM isn't loading all of the required Hadoop artifacts, i.e. the Hadoop jars are not on your runtime classpath.

If you are a Maven user, please ensure that you have these dependencies:

<dependency>
    <groupId>org.apache.hadoop</groupId>
    <artifactId>hadoop-client</artifactId>
    <version>${hadoop.client.version}</version>
</dependency>
<dependency>
    <groupId>org.apache.hadoop</groupId>
    <artifactId>hadoop-common</artifactId>
    <version>${hadoop.client.version}</version>
</dependency>
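
The ${hadoop.client.version} property is not defined in the snippet above; one way to pin it (an assumed value, matching the asker's Hadoop 2.7.0) is a properties block in the same pom.xml:

    <properties>
        <!-- Assumed value: match the version of the Hadoop cluster, 2.7.0 in the question. -->
        <hadoop.client.version>2.7.0</hadoop.client.version>
    </properties>

Once Maven resolves these artifacts, the transitive Hadoop dependencies (including the jar that contains org.apache.hadoop.util.PlatformName) end up on the runtime classpath, so the program can be launched directly from Eclipse as a plain Java application.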
