Cannot connect to HBase using Java
I am trying to connect to HBase from Java. There is only one node, which is my own machine, and it seems I cannot connect successfully.
Here is my Java code:
public class Test {
    public static void main(String[] args) throws MasterNotRunningException, ZooKeeperConnectionException, IOException, ServiceException {
        SparkConf conf = new SparkConf().setAppName("Test").setMaster("spark://10.239.58.111:7077");
        JavaSparkContext sc = new JavaSparkContext(conf);
        sc.addJar("/home/cloudera/workspace/Test/target/Test-0.0.1-SNAPSHOT.jar");
        Configuration hbaseConf = HBaseConfiguration.create();
        hbaseConf.addResource(new Path("/usr/lib/hbase/conf/hbase-site.xml"));
        HTable table = new HTable(hbaseConf, "rdga_by_id");
    }
}
I also tried setting the configuration directly in the code, like this:
hbaseConf.set("hbase.master", "localhost");
hbaseConf.set("hbase.master.port", "60000");
hbaseConf.set("hbase.zookeeper.property.clientPort", "2181");
hbaseConf.set("hbase.zookeeper.quorum", "quickstart.cloudera");
hbaseConf.set("hbase.zookeeper.quorum", "localhost");
but it still does not work.
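Note that the two hbase.zookeeper.quorum lines above set the same key, so only the last value ("localhost") actually takes effect. A minimal sketch of this last-write-wins behavior, using java.util.Properties as a stand-in for the Hadoop Configuration class (which handles repeated set() calls the same way):

```java
import java.util.Properties;

public class OverwriteDemo {
    public static void main(String[] args) {
        Properties conf = new Properties();
        // Both calls target the same key; the second overwrites the first,
        // so the "quickstart.cloudera" value is silently discarded.
        conf.setProperty("hbase.zookeeper.quorum", "quickstart.cloudera");
        conf.setProperty("hbase.zookeeper.quorum", "localhost");
        System.out.println(conf.getProperty("hbase.zookeeper.quorum")); // prints "localhost"
    }
}
```

So if the intent was to point the client at quickstart.cloudera, the later "localhost" line should be removed.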
Here is hbase-site.xml:
<?xml version="1.0"?>
<?xml-stylesheet type="text/xsl" href="configuration.xsl"?>
<configuration>
  <property>
    <name>hbase.rest.port</name>
    <value>8070</value>
    <description>The port for the HBase REST server.</description>
  </property>
  <property>
    <name>hbase.cluster.distributed</name>
    <value>true</value>
  </property>
  <property>
    <name>hbase.rootdir</name>
    <value>hdfs://quickstart.cloudera:8020/hbase</value>
  </property>
  <property>
    <name>hbase.regionserver.ipc.address</name>
    <value>0.0.0.0</value>
  </property>
  <property>
    <name>hbase.master.ipc.address</name>
    <value>0.0.0.0</value>
  </property>
  <property>
    <name>hbase.thrift.info.bindAddress</name>
    <value>0.0.0.0</value>
  </property>
</configuration>
In the web UI while the server is running, the serverName is shown as "quickstart.cloudera,16201,1422941563375".
The error is as follows:
2015-02-02 22:17:03,121 INFO [main] zookeeper.ZooKeeper (ZooKeeper.java:<init>(438)) - Initiating client connection, connectString=quickstart.cloudera:16201 sessionTimeout=90000 watcher=hconnection-0x62ad0636, quorum=quickstart.cloudera:16201, baseZNode=/hbase
Exception in thread "main" java.io.IOException: java.lang.reflect.InvocationTargetException
at org.apache.hadoop.hbase.client.HConnectionManager.createConnection(HConnectionManager.java:413)
at org.apache.hadoop.hbase.client.HConnectionManager.createConnection(HConnectionManager.java:390)
at org.apache.hadoop.hbase.client.HConnectionManager.getConnection(HConnectionManager.java:271)
at org.apache.hadoop.hbase.client.HTable.<init>(HTable.java:198)
at org.apache.hadoop.hbase.client.HTable.<init>(HTable.java:160)
at Test.main(Test.java:52)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:606)
at org.apache.spark.deploy.SparkSubmit$.launch(SparkSubmit.scala:358)
at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:75)
at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
Caused by: java.lang.reflect.InvocationTargetException
at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:57)
at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
at java.lang.reflect.Constructor.newInstance(Constructor.java:526)
at org.apache.hadoop.hbase.client.HConnectionManager.createConnection(HConnectionManager.java:411)
... 12 more
Caused by: java.lang.NoClassDefFoundError: org/cloudera/htrace/Trace
at org.apache.hadoop.hbase.zookeeper.RecoverableZooKeeper.exists(RecoverableZooKeeper.java:216)
at org.apache.hadoop.hbase.zookeeper.ZKUtil.checkExists(ZKUtil.java:479)
at org.apache.hadoop.hbase.zookeeper.ZKClusterId.readClusterIdZNode(ZKClusterId.java:65)
at org.apache.hadoop.hbase.client.ZooKeeperRegistry.getClusterId(ZooKeeperRegistry.java:83)
at org.apache.hadoop.hbase.client.HConnectionManager$HConnectionImplementation.retrieveClusterId(HConnectionManager.java:839)
at org.apache.hadoop.hbase.client.HConnectionManager$HConnectionImplementation.<init>(HConnectionManager.java:642)
... 17 more
Caused by: java.lang.ClassNotFoundException: org.cloudera.htrace.Trace
at java.net.URLClassLoader$1.run(URLClassLoader.java:366)
at java.net.URLClassLoader$1.run(URLClassLoader.java:355)
at java.security.AccessController.doPrivileged(Native Method)
at java.net.URLClassLoader.findClass(URLClassLoader.java:354)
at java.lang.ClassLoader.loadClass(ClassLoader.java:425)
at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:308)
at java.lang.ClassLoader.loadClass(ClassLoader.java:358)
... 23 more
Sorry for making you read so much code. Thanks in advance.
Caused by:
java.lang.NoClassDefFoundError: org/cloudera/htrace/Trace
Based on this line of the stack trace, adding htrace-core.jar to the classpath may help.
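Before touching the classpath, you can confirm from the driver JVM whether the class named in the stack trace is actually visible. A small diagnostic sketch (the class name is copied from the error above):

```java
public class ClasspathCheck {
    public static void main(String[] args) {
        try {
            // Look the class up by its binary name without using it;
            // a ClassNotFoundException here means the jar is absent from the classpath.
            Class.forName("org.cloudera.htrace.Trace");
            System.out.println("htrace is on the classpath");
        } catch (ClassNotFoundException e) {
            System.out.println("htrace is missing; add htrace-core.jar to the classpath");
        }
    }
}
```

If this prints the "missing" line, the driver classpath is the problem; remember that the Spark executors need the jar as well, which sc.addJar or the classpath setup below takes care of.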
For Spark-HBase integration, the best approach is to add the HBase libraries to the Spark classpath. This can be done via the 'compute-classpath.sh' script in the $SPARK_HOME/bin folder: Spark invokes 'compute-classpath.sh' and thereby picks up the required HBase jars.
export CLASSPATH=$CLASSPATH:<path/to/HBase/lib/*>
For example: export CLASSPATH=$CLASSPATH:/opt/cloudera/parcels/CDH/lib/hbase/lib/*
After that, restart Spark.
There you go :)
Provide the full path to this jar, like so:
sc.addJar("htrace-core.jar");
This error also appears when connecting to HBase from a plain Java application, without Spark. I added this to the classpath, but it did not work:
export HADOOP_CLASSPATH=$HADOOP_CLASSPATH:/usr/lib/hbase/*
But then I also added the hbase-solr path, because the htrace jar lives in that path:
export HADOOP_CLASSPATH=$HADOOP_CLASSPATH:/usr/lib/hbase/*:/usr/lib/hbase-solr/lib/*
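To double-check which entries actually reach the JVM at runtime, you can dump the java.class.path system property. A small sketch (the "htrace" filter string is just an assumption about the jar's file name):

```java
public class ShowClasspath {
    public static void main(String[] args) {
        // Classpath entries are joined with the platform path separator (':' on Linux).
        String sep = System.getProperty("path.separator");
        for (String entry : System.getProperty("java.class.path").split(sep)) {
            // Flag any entry whose name suggests it contains the htrace classes.
            System.out.println(entry + (entry.contains("htrace") ? "   <-- htrace" : ""));
        }
    }
}
```

If no entry is flagged, the export above did not take effect for the process you launched.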
Hope this works for you.