
How to access Hive using Java API in DSE

I am running a 5-node cluster on DSE 4.0.3 with 2 Cassandra, 2 Solr, and 1 Hadoop node, and I am trying to connect to Hive through the Java API. Below is the program I am trying to execute:

import java.sql.SQLException;
import java.sql.Connection;
import java.sql.ResultSet;
import java.sql.Statement;
import java.sql.DriverManager;

public class HiveJdbcClient {
  private static String driverName = "org.apache.hadoop.hive.jdbc.HiveDriver";

  /**
   * @param args
   * @throws SQLException
   */
  public static void main(String[] args) throws SQLException {
    try {
      Class.forName(driverName);
    } catch (ClassNotFoundException e) {
      // TODO Auto-generated catch block
      e.printStackTrace();
      System.exit(1);
    }
    Connection con = DriverManager.getConnection("jdbc:hive://localhost:10000/default", "", "");
    Statement stmt = con.createStatement();
    String tableName = "testHiveDriverTable";
    stmt.executeQuery("drop table " + tableName);
    ResultSet res = stmt.executeQuery("create table " + tableName + " (key int, value string)");

    // show tables
    String sql = "show tables '" + tableName + "'";
    System.out.println("Running: " + sql);
    res = stmt.executeQuery(sql);
    if (res.next()) {
      System.out.println(res.getString(1));
    }

    // describe table
    sql = "describe " + tableName;
    System.out.println("Running: " + sql);
    res = stmt.executeQuery(sql);
    while (res.next()) {
      System.out.println(res.getString(1) + "\t" + res.getString(2));
    }

    // load data into table
    // NOTE: filepath has to be local to the hive server
    // NOTE: /tmp/a.txt is a ctrl-A separated file with two fields per line
    String filepath = "/tmp/a.txt";
    sql = "load data local inpath '" + filepath + "' into table " + tableName;
    System.out.println("Running: " + sql);
    res = stmt.executeQuery(sql);

    // select * query
    sql = "select * from " + tableName;
    System.out.println("Running: " + sql);
    res = stmt.executeQuery(sql);
    while (res.next()) {
      System.out.println(String.valueOf(res.getInt(1)) + "\t" + res.getString(2));
    }

    // regular hive query
    sql = "select count(1) from " + tableName;
    System.out.println("Running: " + sql);
    res = stmt.executeQuery(sql);
    while (res.next()) {
      System.out.println(res.getString(1));
    }
  }
}

But I am getting the following error at this line:

Connection con = DriverManager.getConnection("jdbc:hive://localhost:10000/default", "", "");


Exception in thread "main" java.lang.NoSuchMethodError: org.apache.thrift.protocol.TProtocol.getScheme()Ljava/lang/Class;
at org.apache.hadoop.hive.service.ThriftHive$execute_args.write(ThriftHive.java:1076)
at org.apache.thrift.TServiceClient.sendBase(TServiceClient.java:63)
at org.apache.hadoop.hive.service.ThriftHive$Client.send_execute(ThriftHive.java:110)
at org.apache.hadoop.hive.service.ThriftHive$Client.execute(ThriftHive.java:102)
at org.apache.hadoop.hive.jdbc.HiveStatement.executeQuery(HiveStatement.java:192)
at org.apache.hadoop.hive.jdbc.HiveStatement.execute(HiveStatement.java:132)
at org.apache.hadoop.hive.jdbc.HiveConnection.configureConnection(HiveConnection.java:132)
at org.apache.hadoop.hive.jdbc.HiveConnection.<init>(HiveConnection.java:122)
at org.apache.hadoop.hive.jdbc.HiveDriver.connect(HiveDriver.java:106)
at java.sql.DriverManager.getConnection(Unknown Source)
at java.sql.DriverManager.getConnection(Unknown Source)
at example.create.HiveTable.main(HiveTable.java:22)

Note: I had started the Thrift server before running the program, with $ dse hive --service hiveserver

And with the hive2 URL I am getting the following error at this line:

Connection con = DriverManager.getConnection("jdbc:hive2://localhost:10000/default", "", "");

Exception in thread "main" java.sql.SQLException: Invalid URL: jdbc:hive2://54.243.203.229:10000/default
at org.apache.hadoop.hive.jdbc.HiveConnection.<init>(HiveConnection.java:86)
at org.apache.hadoop.hive.jdbc.HiveDriver.connect(HiveDriver.java:106)
at java.sql.DriverManager.getConnection(Unknown Source)
at java.sql.DriverManager.getConnection(Unknown Source)
at example.create.HiveTable.main(HiveTable.java:22)

Note: I had started the Thrift server before running the program, with $ dse hive --service hiveserver2.

My hive-site.xml is:

<configuration>
  <!-- Hive Execution Parameters -->
  <property>
    <name>hive.exec.mode.local.auto</name>
    <value>false</value>
    <description>Let hive determine whether to run in local mode automatically</description>
  </property>
  <property>
    <name>hive.metastore.warehouse.dir</name>
    <value>cfs:///user/hive/warehouse</value>
    <description>location of default database for the warehouse</description>
  </property>
  <property>
    <name>hive.hwi.war.file</name>
    <value>lib/hive-hwi.war</value>
    <description>This sets the path to the HWI war file, relative to ${HIVE_HOME}</description>
  </property>
  <property>
    <name>hive.metastore.rawstore.impl</name>
    <value>com.datastax.bdp.hadoop.hive.metastore.CassandraHiveMetaStore</value>
    <description>Use the Apache Cassandra Hive RawStore implementation</description>
  </property>
  <property>
    <name>hadoop.bin.path</name>
    <value>${dse.bin}/dse hadoop</value>
  </property>
  <!-- Set this to true to enable auto-creation of Cassandra keyspaces as Hive Databases -->
  <property>
    <name>cassandra.autoCreateHiveSchema</name>
    <value>true</value>
  </property>
</configuration>

Can anyone suggest where I am going wrong or what I am missing?

There is a libthrift version conflict in WSO2 DSS 3.2.2+. They include a libthrift jar in their deployment, and it is loaded before the one you drop into the components\lib directory. They have an updated libthrift with the correct interface. Steps to remediate:

Reinstall. The installation path is referred to as $home below.

Download libthrift-0.8.0.wso2v1.jar from http://maven.wso2.org/nexus/content/groups/wso2-public/libthrift/wso2/libthrift/0.8.0.wso2v1/.

Set the Windows environment variable CLASSPATH to $home\repository\components\lib (this may not be necessary).

Copy the libthrift 0.8 jar to $home\repository\components\plugins. Take out the libthrift 0.7 jar.

Edit $home\repository\components\features\org.wso2.carbon.logging.mgt.server_4.2.1. Make the libthrift line end with version="0.8.0.wso2v2".

Edit $home\repository\components\features\org.wso2.carbon.databridge.commons.thrift.server_4.2.0. Change the libthrift version there as well.
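
The NoSuchMethodError on TProtocol.getScheme() in the first stack trace is the typical symptom of this kind of conflict: the Hive JDBC classes were compiled against a newer libthrift than the one actually being loaded. As a quick check, a minimal sketch like the one below (illustrative class name, assuming a libthrift jar is on your classpath) prints which jar the TProtocol class is really loaded from:

import org.apache.thrift.protocol.TProtocol;

public class ThriftVersionCheck {
  public static void main(String[] args) {
    // Print the jar (or directory) that TProtocol was loaded from; if it is an older
    // libthrift than the Hive JDBC driver expects, getScheme() will be missing at runtime.
    System.out.println(TProtocol.class.getProtectionDomain()
        .getCodeSource().getLocation());
  }
}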

I think you are missing some HiveServer2 JDBC jar files and dependencies. The link https://cwiki.apache.org/confluence/display/Hive/HiveServer2+Clients documents the jar files you need to add to your classpath.
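
With those jars on the classpath, a minimal sketch of a HiveServer2 connection (assuming HiveServer2 is listening on localhost:10000) uses the org.apache.hive.jdbc.HiveDriver class together with a jdbc:hive2:// URL. The old org.apache.hadoop.hive.jdbc.HiveDriver used in the question only understands jdbc:hive:// URLs, which is why it reports "Invalid URL" for jdbc:hive2://:

import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.ResultSet;
import java.sql.Statement;

public class HiveServer2Example {
  public static void main(String[] args) throws Exception {
    // HiveServer2 driver class; the HiveServer1 driver rejects jdbc:hive2:// URLs.
    Class.forName("org.apache.hive.jdbc.HiveDriver");
    try (Connection con = DriverManager.getConnection(
             "jdbc:hive2://localhost:10000/default", "", "");
         Statement stmt = con.createStatement();
         ResultSet res = stmt.executeQuery("show tables")) {
      while (res.next()) {
        System.out.println(res.getString(1));
      }
    }
  }
}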
