Hadoop ClassNotFoundException
I'm writing my first Hadoop application and I'm getting an error. I don't quite understand what some of the details in this stack trace mean. It's a ClassNotFoundException. I'm building this on Ubuntu Linux 12.10 with Eclipse 3.8.0 and Java 1.6.0_24. I installed Hadoop by downloading it from the Apache site and building it with Ant.
My program crashes on the first line of main, where I create the job:

public static void main(String[] args) throws IOException, InterruptedException, ClassNotFoundException {
    Job job = new Job(); // <<== crashing here
Program [Java Application]
com.sandbox.hadoop.Program at localhost:33878
Thread [main] (Suspended (exception ClassNotFoundException))
owns: Launcher$AppClassLoader (id=29)
owns: Class<T> (org.apache.hadoop.security.UserGroupInformation) (id=25)
URLClassLoader$1.run() line: 217
AccessController.doPrivileged(PrivilegedExceptionAction<T>, AccessControlContext) line: not available [native method]
Launcher$AppClassLoader(URLClassLoader).findClass(String) line: 205
Launcher$AppClassLoader(ClassLoader).loadClass(String, boolean) line: 321
Launcher$AppClassLoader.loadClass(String, boolean) line: 294
Launcher$AppClassLoader(ClassLoader).loadClass(String) line: 266
DefaultMetricsSystem.<init>() line: 37
DefaultMetricsSystem.<clinit>() line: 34
UgiInstrumentation.create(Configuration) line: 51
UserGroupInformation.initialize(Configuration) line: 216
UserGroupInformation.ensureInitialized() line: 184
UserGroupInformation.isSecurityEnabled() line: 236
KerberosName.<clinit>() line: 79
UserGroupInformation.initialize(Configuration) line: 209
UserGroupInformation.ensureInitialized() line: 184
UserGroupInformation.isSecurityEnabled() line: 236
UserGroupInformation.getLoginUser() line: 477
UserGroupInformation.getCurrentUser() line: 463
Job(JobContext).<init>(Configuration, JobID) line: 80
Job.<init>(Configuration) line: 50
Job.<init>() line: 46
Program.main(String[]) line: 17
/usr/lib/jvm/java-6-openjdk-amd64/bin/java (Jan 14, 2013 2:42:36 PM)
Console Output:
Exception in thread "main" java.lang.NoClassDefFoundError: org/apache/commons/configuration/Configuration
at org.apache.hadoop.metrics2.lib.DefaultMetricsSystem.<init>(DefaultMetricsSystem.java:37)
at org.apache.hadoop.metrics2.lib.DefaultMetricsSystem.<clinit>(DefaultMetricsSystem.java:34)
at org.apache.hadoop.security.UgiInstrumentation.create(UgiInstrumentation.java:51)
at org.apache.hadoop.security.UserGroupInformation.initialize(UserGroupInformation.java:216)
at org.apache.hadoop.security.UserGroupInformation.ensureInitialized(UserGroupInformation.java:184)
at org.apache.hadoop.security.UserGroupInformation.isSecurityEnabled(UserGroupInformation.java:236)
at org.apache.hadoop.security.KerberosName.<clinit>(KerberosName.java:79)
at org.apache.hadoop.security.UserGroupInformation.initialize(UserGroupInformation.java:209)
at org.apache.hadoop.security.UserGroupInformation.ensureInitialized(UserGroupInformation.java:184)
at org.apache.hadoop.security.UserGroupInformation.isSecurityEnabled(UserGroupInformation.java:236)
at org.apache.hadoop.security.UserGroupInformation.getLoginUser(UserGroupInformation.java:477)
at org.apache.hadoop.security.UserGroupInformation.getCurrentUser(UserGroupInformation.java:463)
at org.apache.hadoop.mapreduce.JobContext.<init>(JobContext.java:80)
at org.apache.hadoop.mapreduce.Job.<init>(Job.java:50)
at org.apache.hadoop.mapreduce.Job.<init>(Job.java:46)
at com.sandbox.hadoop.Program.main(Program.java:18)
Caused by: java.lang.ClassNotFoundException: org.apache.commons.configuration.Configuration
at java.net.URLClassLoader$1.run(URLClassLoader.java:217)
at java.security.AccessController.doPrivileged(Native Method)
at java.net.URLClassLoader.findClass(URLClassLoader.java:205)
at java.lang.ClassLoader.loadClass(ClassLoader.java:321)
at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:294)
at java.lang.ClassLoader.loadClass(ClassLoader.java:266)
... 16 more
You should add all the jars found in /usr/lib/hadoop-0.xx/lib to the classpath to avoid this kind of classpath issue.
To give you an idea, you can run hadoop classpath, which prints the classpath needed to get the Hadoop jar and the required libraries.
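For example (a sketch; the exact jars printed depend on your Hadoop version and install location, and myjob.jar is an illustrative name for your own build output):

```shell
# Print the classpath Hadoop itself uses (output varies by install):
hadoop classpath

# One way to reuse it when launching the driver class from the question:
java -cp "$(hadoop classpath):myjob.jar" com.sandbox.hadoop.Program
```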
In your case, you're missing hadoop-common-0.xx.jar, so you should add this to the classpath and you should be good to go.
Does your main program need org.apache.commons.configuration.Configuration, or should this be org.apache.hadoop.conf.Configuration?
It looks like Eclipse has auto-imported the wrong Configuration class, which isn't on the classpath when Hadoop runs on your cluster.
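To confirm which of the two classes is actually visible at runtime, a small diagnostic like the following (not from the original post; the class names are taken from the stack trace) can be run with the same classpath as the failing job:

```java
// Diagnostic sketch: report whether each Configuration class can be loaded.
public class ConfigCheck {
    static boolean isVisible(String className) {
        try {
            Class.forName(className);
            return true;
        } catch (ClassNotFoundException e) {
            return false;
        }
    }

    public static void main(String[] args) {
        // The class the JVM failed to load, per the stack trace:
        System.out.println("commons-configuration Configuration visible: "
                + isVisible("org.apache.commons.configuration.Configuration"));
        // The Hadoop class a MapReduce driver normally uses:
        System.out.println("hadoop conf Configuration visible: "
                + isVisible("org.apache.hadoop.conf.Configuration"));
    }
}
```

If the first line prints false while your Hadoop jars are present, the commons-configuration jar is what's missing from the classpath.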
Can you share your source code, in particular the main method of the com.sandbox.hadoop.Program class?
I ran into the same problem. I solved it by adding commons-configuration-xx.jar to my build path; it's located under $HADOOP_HOME/lib.
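To check whether that jar is present (a sketch; the exact file name varies by Hadoop version, and $HADOOP_HOME must point at your install):

```shell
# List the commons-configuration jar shipped with Hadoop, if any:
ls "$HADOOP_HOME"/lib/commons-configuration-*.jar
```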