
Hadoop/Eclipse - Exception in thread "main" java.lang.NoClassDefFoundError: org/apache/hadoop/fs/FileSystem

I'm trying to run the PutMerge program from Hadoop in Action by Chuck Lam (Manning Publishing). It should be pretty simple, but I've had a bunch of problems trying to run it, and I've hit this error that I just can't figure out. Meanwhile, a basic wordcount program runs with no problem. I've spent about three days on this now, done all the research I can, and I'm just lost.

Y'all have any ideas?

Program:

import java.io.IOException;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FSDataInputStream;
import org.apache.hadoop.fs.FSDataOutputStream;
import org.apache.hadoop.fs.FileStatus;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;


public class PutMerge {

    public static void main(String[] args) throws IOException {
        Configuration conf = new Configuration();

        FileSystem hdfs = FileSystem.get(conf);
        FileSystem local = FileSystem.getLocal(conf);

        Path inputDir = new Path(args[0]);
        Path hdfsFile = new Path(args[1]);


        try{
            FileStatus[] inputFiles = local.listStatus(inputDir);
            FSDataOutputStream out = hdfs.create(hdfsFile);

            for (int i=0; i<inputFiles.length; i++){
                System.out.println(inputFiles[i].getPath().getName());
                FSDataInputStream in = local.open(inputFiles[i].getPath());

                byte buffer[] = new byte[256];
                int bytesRead = 0;

                while( (bytesRead = in.read(buffer)) > 0) {
                    out.write(buffer, 0, bytesRead);
                }

                in.close();

            }

            out.close();

        } catch(IOException e){

            e.printStackTrace();

        }

    }

}

Output Error from Eclipse:

    2015-04-09 19:45:48,321 WARN  util.NativeCodeLoader (NativeCodeLoader.java:<clinit>(62)) - Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
Exception in thread "main" java.lang.NoClassDefFoundError: org/apache/hadoop/fs/FileSystem
    at java.lang.ClassLoader.findBootstrapClass(Native Method)
    at java.lang.ClassLoader.findBootstrapClassOrNull(ClassLoader.java:1012)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:413)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:411)
    at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:308)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
    at java.lang.Class.forName0(Native Method)
    at java.lang.Class.forName(Class.java:344)
    at java.util.ServiceLoader$LazyIterator.nextService(ServiceLoader.java:370)
    at java.util.ServiceLoader$LazyIterator.next(ServiceLoader.java:404)
    at java.util.ServiceLoader$1.next(ServiceLoader.java:480)
    at org.apache.hadoop.fs.FileSystem.loadFileSystems(FileSystem.java:2563)
    at org.apache.hadoop.fs.FileSystem.getFileSystemClass(FileSystem.java:2574)
    at org.apache.hadoop.fs.FileSystem.createFileSystem(FileSystem.java:2591)
    at org.apache.hadoop.fs.FileSystem.access$200(FileSystem.java:91)
    at org.apache.hadoop.fs.FileSystem$Cache.getInternal(FileSystem.java:2630)
    at org.apache.hadoop.fs.FileSystem$Cache.get(FileSystem.java:2612)
    at org.apache.hadoop.fs.FileSystem.get(FileSystem.java:370)
    at org.apache.hadoop.fs.FileSystem.get(FileSystem.java:169)
    at PutMerge.main(PutMerge.java:16)
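The stack trace shows the failure happening while FileSystem's static initializer scans the classpath via ServiceLoader, which is the classic symptom of hadoop-common (or one of its transitive jars) missing from the runtime classpath rather than a bug in PutMerge itself. A minimal, stdlib-only illustration of the same condition (the class and method names here are mine, purely hypothetical):

```java
public class MissingClassDemo {

    // Returns true if the named class can be loaded from the classpath.
    static boolean isOnClasspath(String className) {
        try {
            Class.forName(className);
            return true;
        } catch (ClassNotFoundException e) {
            return false;
        }
    }

    public static void main(String[] args) {
        // With no Hadoop jars on the classpath this prints false --
        // the same condition that surfaces as NoClassDefFoundError
        // when FileSystem.loadFileSystems() runs.
        System.out.println(isOnClasspath("org.apache.hadoop.fs.FileSystem"));
    }
}
```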

About Eclipse:

Eclipse IDE for Java Developers
Version: Luna Service Release 2 (4.4.2)
Build id: 20150219-0600

(screenshot: Eclipse installation details)

About Hadoop:

Hadoop 2.6.0
Subversion https://git-wip-us.apache.org/repos/asf/hadoop.git -r e3496499ecb8d220fba99dc5ed4c99c8f9e33bb1
Compiled by jenkins on 2014-11-13T21:10Z
Compiled with protoc 2.5.0
From source with checksum 18e43357c8f927c0695f1e9522859d6a
This command was run using /usr/local/hadoop-2.6.0/share/hadoop/common/hadoop-common-2.6.0.jar

About Java:

java version "1.8.0_31"
Java(TM) SE Runtime Environment (build 1.8.0_31-b13)
Java HotSpot(TM) 64-Bit Server VM (build 25.31-b07, mixed mode)  

About my machine:

Mac OSX 10.9.5

Java Build Path - External JARs in Library:

Hadoop Common
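Adding hadoop-common alone is usually not enough: FileSystem's service lookup also needs the transitive jars under share/hadoop/common/lib. Outside Eclipse, a sketch of compiling and running with classpath wildcards over both directories (the HADOOP_HOME path is an assumption; adjust it to your install):

```shell
# Hypothetical Hadoop install root -- adjust to your machine.
HADOOP_HOME=/usr/local/hadoop-2.6.0

# hadoop-common plus its dependency jars; the dir/* wildcard form
# expands to every jar in the directory at JVM startup.
CP="$HADOOP_HOME/share/hadoop/common/*:$HADOOP_HOME/share/hadoop/common/lib/*"

javac -cp "$CP" PutMerge.java
java -cp ".:$CP" PutMerge /tmp/input /output/all
```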

My experience with Eclipse IDE:

My base path for the Ubuntu installation is usr/hadoop/hadoop-2.7.1 (let's call it CONF). I've added the jar files from CONF/share/hadoop/common/lib and from CONF/share/hadoop/common. And this is the Java code (from the book Hadoop in Action):

import java.io.IOException;
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FSDataInputStream;
import org.apache.hadoop.fs.FSDataOutputStream;
import org.apache.hadoop.fs.FileStatus;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;


public class PutMerge {


public static void main(String[] args) throws IOException {
        Configuration conf = new Configuration();

        conf.set("fs.file.impl",org.apache.hadoop.fs.LocalFileSystem.class.getName());

        org.apache.hadoop.fs.FileSystem hdfs = org.apache.hadoop.fs.FileSystem.get(conf);
        FileSystem local = org.apache.hadoop.fs.FileSystem.getLocal(conf);
        Path inputDir = new Path(args[0]);
        Path hdfsFile = new Path(args[1]);
        try {
            FileStatus[] inputFiles = local.listStatus(inputDir);
            FSDataOutputStream out = hdfs.create(hdfsFile);
            for (int i=0; i<inputFiles.length; i++) {
                System.out.println(inputFiles[i].getPath().getName());
                FSDataInputStream in = local.open(inputFiles[i].getPath());
                byte buffer[] = new byte[256];
                int bytesRead = 0;
                while( (bytesRead = in.read(buffer)) > 0) {
                    out.write(buffer, 0, bytesRead);
                }
                in.close();
            }
            out.close();
        } catch (IOException e) {
            e.printStackTrace();
        }
    }
}

The solution for me was to export a .jar file from this code. Right-click on the PutMerge project, then Export (from the pop-up menu):


and saved the jar file in a folder named PutMerge in the /home/hduser directory.

In another folder named input (path /home/hduser/input) there are three .txt files as input for the PutMerge procedure.

And now we are ready to launch the command from a terminal session: hadoop jar /home/hduser/PutMerge/PutMerge.jar PutMerge /home/hduser/input output4/all

and the command /usr/hadoop/hadoop-2.7.1$ hdfs dfs -cat /output4/all

will output all the text of the three files, concatenated.

Put something like this in your code:

Configuration configuration = new Configuration();
configuration.set("fs.hdfs.impl", org.apache.hadoop.hdfs.DistributedFileSystem.class.getName());
configuration.set("fs.file.impl", org.apache.hadoop.fs.LocalFileSystem.class.getName());
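This workaround matters most when Hadoop jars are merged into a single fat jar: each jar ships its own META-INF/services/org.apache.hadoop.fs.FileSystem registration file, and merging can let one copy clobber another, so FileSystem's ServiceLoader no longer sees every implementation. Setting fs.hdfs.impl and fs.file.impl explicitly bypasses that lookup. A stdlib-only sketch (class and method names are mine, not Hadoop's) that counts how many copies of a service registration file are visible on the classpath:

```java
import java.io.IOException;
import java.util.Collections;

public class ServiceFileCheck {

    // Count how many copies of a classpath resource are visible.
    // Healthy setups show one registration file per Hadoop jar that
    // provides FileSystem implementations; a badly merged fat jar
    // shows fewer than expected.
    static int countResources(String path) throws IOException {
        return Collections.list(
                ServiceFileCheck.class.getClassLoader().getResources(path))
            .size();
    }

    public static void main(String[] args) throws IOException {
        System.out.println(countResources(
            "META-INF/services/org.apache.hadoop.fs.FileSystem"));
    }
}
```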

I had this problem when my local Maven repository contained corrupted JAR files. Same as you, I could see that hadoop-common-xxxjar existed in Eclipse when viewing the "Maven Dependencies" of my Java project. However, when expanding the JAR file in Eclipse and selecting the class named org.apache.hadoop.fs.FSDataInputStream, Eclipse reported a message like "Invalid LOC header".

Deleting all files from my local Maven repository and executing mvn install again resolved my issue.
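In concrete terms, a sketch of that cleanup, assuming the default local repository location (~/.m2/repository) and that only the Hadoop artifacts are suspect:

```shell
# Delete the possibly-corrupted Hadoop artifacts from the local
# Maven repository; mvn re-downloads them on the next build.
rm -rf ~/.m2/repository/org/apache/hadoop
mvn install
```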

If you are using a run configuration to launch your app for debugging, make sure the "Include Dependencies with Provided Scope" checkbox is checked if any of your dependencies are declared with provided scope. Following this approach worked for me.
