
hadoop class not found exception even though it's there in hadoop classpath

I just installed Hadoop and have the simple program below, which I found online, to display the configuration:

import java.util.Iterator;
import java.util.Map;

import org.apache.hadoop.conf.Configuration;

public class printHadoop {

  public static void main(String[] args) {
    Configuration conf = new Configuration();
    if (args.length == 1)
      conf.addResource(args[0]);
    // Configuration is Iterable<Map.Entry<String, String>>
    Iterator<Map.Entry<String, String>> it = conf.iterator();
    while (it.hasNext()) {
      System.out.println(it.next());
    }
  }
}

I compiled it and tried to run it through hadoop:

$ hadoop -cp . printHadoop

It throws a ClassNotFoundException: it is not able to find the org.apache.hadoop.conf.* classes, which are in the Hadoop folder.

But when I look at the classpath for hadoop:

$ hadoop classpath

It does include the jars that contain the required packages:

/usr/local/hadoop/libexec/../conf:/usr/lib/jvm/java-6-openjdk-amd64/lib/tools.jar:/usr/local/hadoop/libexec/..:/usr/local/hadoop/libexec/../hadoop-core-1.0.4.jar:/usr/local/hadoop/libexec/../lib/asm-3.2.jar:/usr/local/hadoop/libexec/../lib/aspectjrt-1.6.5.jar:/usr/local/hadoop/libexec/../lib/aspectjtools-1.6.5.jar:/usr/local/hadoop/libexec/../lib/commons-beanutils-1.7.0.jar:/usr/local/hadoop/libexec/../lib/commons-beanutils-core-1.8.0.jar:/usr/local/hadoop/libexec/../lib/commons-cli-1.2.jar:/usr/local/hadoop/libexec/../lib/commons-codec-1.4.jar:/usr/local/hadoop/libexec/../lib/commons-collections-3.2.1.jar:/usr/local/hadoop/libexec/../lib/commons-configuration-1.6.jar:/usr/local/hadoop/libexec/../lib/commons-daemon-1.0.1.jar:/usr/local/hadoop/libexec/../lib/commons-digester-1.8.jar:/usr/local/hadoop/libexec/../lib/commons-el-1.0.jar:/usr/local/hadoop/libexec/../lib/commons-httpclient-3.0.1.jar:/usr/local/hadoop/libexec/../lib/commons-io-2.1.jar:/usr/local/hadoop/libexec/../lib/commons-lang-2.4.jar:/usr/local/hadoop/libexec/../lib/commons-logging-1.1.1.jar:/usr/local/hadoop/libexec/../lib/commons-logging-api-1.0.4.jar:/usr/local/hadoop/libexec/../lib/commons-math-2.1.jar:/usr/local/hadoop/libexec/../lib/commons-net-1.4.1.jar:/usr/local/hadoop/libexec/../lib/core-3.1.1.jar:/usr/local/hadoop/libexec/../lib/hadoop-capacity-scheduler-1.0.4.jar:/usr/local/hadoop/libexec/../lib/hadoop-fairscheduler-1.0.4.jar:/usr/local/hadoop/libexec/../lib/hadoop-thriftfs-1.0.4.jar:/usr/local/hadoop/libexec/../lib/hsqldb-1.8.0.10.jar:/usr/local/hadoop/libexec/../lib/jackson-core-asl-1.8.8.jar:/usr/local/hadoop/libexec/../lib/jackson-mapper-asl-1.8.8.jar:/usr/local/hadoop/libexec/../lib/jasper-compiler-5.5.12.jar:/usr/local/hadoop/libexec/../lib/jasper-runtime-5.5.12.jar:/usr/local/hadoop/libexec/../lib/jdeb-0.8.jar:/usr/local/hadoop/libexec/../lib/jersey-core-1.8.jar:/usr/local/hadoop/libexec/../lib/jersey-json-1.8.jar:/usr/local/hadoop/libexec/../lib/jersey-server-1.8.jar:/usr/local/hadoop/libexec/../lib/jets3t-0.6.1.jar:/usr/local/hadoop/libexec/../lib/jetty-6.1.26.jar:/usr/local/hadoop/libexec/../lib/jetty-util-6.1.26.jar:/usr/local/hadoop/libexec/../lib/jsch-0.1.42.jar:/usr/local/hadoop/libexec/../lib/junit-4.5.jar:/usr/local/hadoop/libexec/../lib/kfs-0.2.2.jar:/usr/local/hadoop/libexec/../lib/log4j-1.2.15.jar:/usr/local/hadoop/libexec/../lib/mockito-all-1.8.5.jar:/usr/local/hadoop/libexec/../lib/oro-2.0.8.jar:/usr/local/hadoop/libexec/../lib/servlet-api-2.5-20081211.jar:/usr/local/hadoop/libexec/../lib/slf4j-api-1.4.3.jar:/usr/local/hadoop/libexec/../lib/slf4j-log4j12-1.4.3.jar:/usr/local/hadoop/libexec/../lib/xmlenc-0.52.jar:/usr/local/hadoop/libexec/../lib/jsp-2.1/jsp-2.1.jar:/usr/local/hadoop/libexec/../lib/jsp-2.1/jsp-api-2.1.jar

Any idea why the ClassNotFoundException is happening when all the required jars are shown above?
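The underlying failure mode can be reproduced with plain Java, independent of Hadoop: a class resolves only if it is on the classpath of the JVM that was actually launched, regardless of what the hadoop launcher script reports. A minimal sketch (the ClasspathCheck class name is just for illustration):

```java
// Hypothetical demo class: reports whether a named class is visible on
// this JVM's classpath. Note that `hadoop classpath` prints what the
// launcher script *would* pass to Java; it says nothing about the
// classpath of a JVM started with different arguments.
public class ClasspathCheck {

    // Returns true if the class can be loaded from this JVM's classpath.
    public static boolean isOnClasspath(String className) {
        try {
            Class.forName(className);
            return true;
        } catch (ClassNotFoundException e) {
            return false;
        }
    }

    public static void main(String[] args) {
        String name = args.length > 0 ? args[0]
                : "org.apache.hadoop.conf.Configuration";
        System.out.println(name + " on classpath: " + isOnClasspath(name));
    }
}
```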

Rather than using hadoop -cp . printHadoop, try hadoop jar . printHadoop, having done the following first:

export HADOOP_CLASSPATH=$HADOOP_CLASSPATH:`echo *.jar`:`echo lib/*.jar | sed 's/ /:/g'`

This will add all jars found in the current directory (and in the lib directory, if you have one) to the classpath.
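The glob-to-classpath trick in that export line can be seen in isolation; the /tmp paths below exist only for the demonstration:

```shell
# Create a throwaway lib directory with two dummy jars.
mkdir -p /tmp/cp_demo/lib
touch /tmp/cp_demo/lib/a.jar /tmp/cp_demo/lib/b.jar
cd /tmp/cp_demo

# `echo lib/*.jar` expands the glob to a space-separated list of paths;
# sed then rewrites each space as the classpath separator ':'.
entry=$(echo lib/*.jar | sed 's/ /:/g')
echo "$entry"    # prints lib/a.jar:lib/b.jar
```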

As I can see from your question, you're using the wrong syntax. The correct form should be:

hadoop jar <your-jar> printHadoop
