Exception in thread "main" java.lang.NoClassDefFoundError: org/apache/hadoop/hbase/HBaseConfiguration
I am using Hadoop 1.0.3 and HBase 0.94.22. I am trying to run a mapper program that reads values from an HBase table and writes them to a file. I am getting the following error:
Exception in thread "main" java.lang.NoClassDefFoundError: org/apache/hadoop/hbase/HBaseConfiguration
at java.lang.Class.forName0(Native Method)
at java.lang.Class.forName(Class.java:340)
at org.apache.hadoop.util.RunJar.main(RunJar.java:149)
Caused by: java.lang.ClassNotFoundException: org.apache.hadoop.hbase.HBaseConfiguration
at java.net.URLClassLoader$1.run(URLClassLoader.java:372)
at java.net.URLClassLoader$1.run(URLClassLoader.java:361)
at java.security.AccessController.doPrivileged(Native Method)
at java.net.URLClassLoader.findClass(URLClassLoader.java:360)
at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
The code is as follows:
import java.io.IOException;

import org.apache.hadoop.fs.Path;
import org.apache.hadoop.hbase.HBaseConfiguration;
import org.apache.hadoop.hbase.client.Result;
import org.apache.hadoop.hbase.client.Scan;
import org.apache.hadoop.hbase.filter.FirstKeyOnlyFilter;
import org.apache.hadoop.hbase.io.ImmutableBytesWritable;
import org.apache.hadoop.hbase.mapreduce.TableMapReduceUtil;
import org.apache.hadoop.hbase.mapreduce.TableMapper;
import org.apache.hadoop.hbase.util.Bytes;
import org.apache.hadoop.io.IntWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Job;
import org.apache.hadoop.mapreduce.lib.output.FileOutputFormat;

public class Test {

    static class TestMapper extends TableMapper<Text, IntWritable> {
        private static final IntWritable one = new IntWritable(1);

        @Override
        public void map(ImmutableBytesWritable row, Result value, Context context)
                throws IOException, InterruptedException {
            // Wrap only the first SIZEOF_INT bytes of the row key.
            ImmutableBytesWritable userkey = new ImmutableBytesWritable(row.get(), 0, Bytes.SIZEOF_INT);
            // Use offset/length here: userkey.get() alone returns the whole
            // backing array, not just the wrapped slice.
            String key = Bytes.toString(userkey.get(), userkey.getOffset(), userkey.getLength());
            context.write(new Text(key), one);
        }
    }

    public static void main(String[] args) throws Exception {
        HBaseConfiguration conf = new HBaseConfiguration();
        Job job = new Job(conf, "hbase_freqcounter");
        job.setJarByClass(Test.class);
        Scan scan = new Scan();
        FileOutputFormat.setOutputPath(job, new Path(args[0]));
        String columns = "data";
        scan.addFamily(Bytes.toBytes(columns));
        scan.setFilter(new FirstKeyOnlyFilter());
        TableMapReduceUtil.initTableMapperJob("test", scan, TestMapper.class,
                Text.class, IntWritable.class, job);
        job.setOutputKeyClass(Text.class);
        job.setOutputValueClass(IntWritable.class);
        System.exit(job.waitForCompletion(true) ? 0 : 1);
    }
}
I exported the above code to a jar file, and on the command line I use the command below to run it:

hadoop jar /home/testdb.jar test

where test is the folder to which the mapper results should be written.

I have checked a few other links, such as Caused by: java.lang.ClassNotFoundException: org.apache.zookeeper.KeeperException, where it was suggested to include the zookeeper jar in the classpath, but while creating the project in Eclipse I already included the zookeeper jar from HBase's lib directory. The file I included is zookeeper-3.4.5.jar. I also visited HBase - java.lang.NoClassDefFoundError in java, but I am using a mapper class to get the values from the HBase table, not any client API. I know I am making a mistake somewhere; could you please help me out?

I have noticed another strange thing: when I remove all of the code in the main function except the first line "HBaseConfiguration conf = new HBaseConfiguration();", export the code to a jar file, and try to run the jar as hadoop jar test.jar, I still get the same error. It seems either I am defining the conf variable incorrectly or there is some issue with my environment.
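That last experiment points at the launcher classpath rather than the code: `RunJar` resolves classes from the classpath of the `hadoop` command, not from Eclipse's build path. A quick diagnostic sketch (the jar path is an assumed example for these versions):

```shell
# List what the hadoop launcher actually puts on its classpath;
# if no HBase jar appears, HBaseConfiguration cannot be resolved.
hadoop classpath | tr ':' '\n' | grep -i hbase

# Verify the class really lives in the HBase jar (assumed example path).
unzip -l "$HBASE_HOME/hbase-0.94.22.jar" | grep HBaseConfiguration.class
```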
I found the fix for the problem: I had not added the HBase classpath in the hadoop-env.sh file. Below is what I added to make the job work.
$ export HADOOP_CLASSPATH=$HBASE_HOME/hbase-0.94.22.jar:\
$HBASE_HOME/hbase-0.94.22-test.jar:\
$HBASE_HOME/conf:\
${HBASE_HOME}/lib/zookeeper-3.4.5.jar:\
${HBASE_HOME}/lib/protobuf-java-2.4.0a.jar:\
${HBASE_HOME}/lib/guava-11.0.2.jar
I tried editing the hadoop-env.sh file, but the changes mentioned here didn't work for me. What worked is this:

export HADOOP_CLASSPATH="$HADOOP_CLASSPATH:$HBASE_HOME/lib/*"

I just added that at the end of my hadoop-env.sh. Do not forget to set your HBASE_HOME variable. You can also replace $HBASE_HOME with the actual path of your HBase installation.
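Put together, the tail of hadoop-env.sh would then look roughly like this (the installation path is an assumed example):

```shell
# hadoop-env.sh (assumed example installation path)
export HBASE_HOME=/usr/local/hbase-0.94.22

# Put every jar in HBase's lib directory on Hadoop's classpath.
export HADOOP_CLASSPATH="$HADOOP_CLASSPATH:$HBASE_HOME/lib/*"
```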
In case there is someone who has different paths/configuration, here is what I added to hadoop-env.sh in order to make it work:

export HADOOP_CLASSPATH="$HBASE_HOME/lib/hbase-client-0.98.11-hadoop2.jar:\
$HBASE_HOME/lib/hbase-common-0.98.11-hadoop2.jar:\
$HBASE_HOME/lib/protobuf-java-2.5.0.jar:\
$HBASE_HOME/lib/guava-12.0.1.jar:\
$HBASE_HOME/lib/zookeeper-3.4.6.jar:\
$HBASE_HOME/lib/hbase-protocol-0.98.11-hadoop2.jar"

NOTE: if you haven't set $HBASE_HOME, you have 2 choices:
- export HBASE_HOME=[your hbase installation path]
- Or just replace $HBASE_HOME with your HBase full path
HADOOP_USER_CLASSPATH_FIRST=true \
HADOOP_CLASSPATH=$($HBASE_HOME/bin/hbase mapredcp) \
hadoop jar /home/testdb.jar test
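The `hbase mapredcp` command above prints the minimal classpath HBase needs for MapReduce jobs, which avoids hand-maintaining jar lists. A small diagnostic sketch to inspect what it contributes, assuming HBASE_HOME is set:

```shell
# Print the jars `hbase mapredcp` would add, one per line.
"$HBASE_HOME/bin/hbase" mapredcp | tr ':' '\n'
```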
Here CreateTable is my Java class file. Use this command:
java -cp .:/home/hadoop/hbase/hbase-0.94.8/hbase-0.94.8.jar:/home/hadoop/hbase/hbase-0.94.8/lib/* CreateTable