java.lang.NoClassDefFoundError with HBase Scan
I am trying to run a MapReduce job that scans an HBase table. I am currently using HBase 0.94.6, the version that ships with Cloudera CDH 4.4. At one point my program uses Scan(), and I import it properly with:
import org.apache.hadoop.hbase.client.Scan;
It compiles fine and I am able to create a jar file too. I do that by passing the output of
hbase classpath
as the value for the -cp option. When I run the program, I get the following message:
Exception in thread "main" java.lang.NoClassDefFoundError: org/apache/hadoop/hbase/client/Scan
I run the code using:
hadoop jar my_program.jar MyJobClass -libjars <list_of_jars>
where list_of_jars contains /opt/cloudera/parcels/CDH/lib/hbase/hbase.jar. Just to double-check, I confirmed that hbase.jar contains Scan. I did that with:
jar tf /opt/cloudera/parcels/CDH/lib/hbase/hbase.jar
And I can see the line:
org/apache/hadoop/hbase/client/Scan.class
in the output. Everything looks fine to me. I don't understand why it says that Scan is not defined: I pass the correct jar, and it contains the class.
Any help is appreciated.
Setting the HADOOP_CLASSPATH variable fixed the problem:
export HADOOP_CLASSPATH=`/usr/bin/hbase classpath`
The -libjars option only ships the listed jars to the task JVMs via the distributed cache; the driver class itself runs in the local client JVM, whose classpath is controlled by HADOOP_CLASSPATH. That is why the NoClassDefFoundError was thrown in the "main" thread even though the jar was passed correctly.
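A minimal sketch of the full fixed invocation, assuming the CDH parcel layout from the question; my_program.jar and MyJobClass are the placeholders used above, not real artifacts:

```shell
# Put the HBase jars on the client (driver) JVM's classpath.
# This fixes the NoClassDefFoundError thrown in the "main" thread.
export HADOOP_CLASSPATH=`/usr/bin/hbase classpath`

# Submit the job as before; -libjars is still needed so the
# task JVMs on the cluster can also resolve the HBase classes.
hadoop jar my_program.jar MyJobClass \
  -libjars /opt/cloudera/parcels/CDH/lib/hbase/hbase.jar
```

This keeps the two classpaths separate on purpose: HADOOP_CLASSPATH for the local driver, -libjars for the distributed tasks.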