Insert data into HIVE over HADOOP

I am using hadoop-1.0.4 and hive-0.10.0 on redhat5. The services start successfully. I can create, drop, and select from tables easily, but I don't know how to insert data.

For example, I have two text boxes, and on a button click I want to store their values in a table (userInfo). I have no clue how to store the textbox values in userInfo(id, password).

import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.ResultSet;
import java.sql.SQLException;
import java.sql.Statement;

public class HiveJdbcClient {
  private static String driverName = "org.apache.hadoop.hive.jdbc.HiveDriver";

  public static void main(String[] args) throws SQLException {
    // Load the HiveServer1 JDBC driver
    try {
      Class.forName(driverName);
    } catch (ClassNotFoundException e) {
      e.printStackTrace();
      System.exit(1);
    }
    // Connect to the Hive server on localhost:10000, database "enggheads"
    Connection con = DriverManager.getConnection("jdbc:hive://localhost:10000/enggheads", "", "");
    Statement stmt = con.createStatement();
    String tableName = "testHiveDriverTable";
    // Recreate the test table
    stmt.executeQuery("drop table " + tableName);
    ResultSet res = stmt.executeQuery("create table " + tableName + " (key int, value string)");
    // Confirm the table exists
    String sql = "show tables '" + tableName + "'";
    System.out.println("Running: " + sql);
    res = stmt.executeQuery(sql);
    if (res.next()) {
      System.out.println(res.getString(1));
    }
  }
}

It's Java, but I don't know how to insert the two field values, because Hive's insert syntax is different from MySQL and other databases.

Create a dummy table in hive like below:

create table dummy(dummy string) location '/path';

The above path should contain a file with a single row of data, X.
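
A minimal sketch of putting that one-row file in place with the Hadoop FileSystem API; the file name /path/dummy.txt and the row value X are placeholder assumptions matching the table location above:

import java.io.IOException;
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FSDataOutputStream;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;

public class WriteDummyRow {
  public static void main(String[] args) throws IOException {
    // Picks up core-site.xml from the classpath so fs.default.name points at your HDFS namenode
    Configuration conf = new Configuration();
    FileSystem fs = FileSystem.get(conf);
    // Hypothetical file name under the dummy table's location; Hive reads every file under '/path'
    FSDataOutputStream out = fs.create(new Path("/path/dummy.txt"));
    out.writeBytes("X\n"); // the single row the dummy table will expose
    out.close();
    fs.close();
  }
}

The same result can be achieved from the command line with hadoop fs -put and a one-line local file.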

Now run the insert query from the JDBC driver, like below.

insert into table tblname select 'frontendvalue1', 'frontendvalue2' from dummy;
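
Tying this back to the question, a minimal sketch of running that insert over JDBC with the two textbox values; the table userinfo(id string, password string), the database enggheads, and the placeholder values are assumptions taken from the question:

import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.SQLException;
import java.sql.Statement;

public class InsertUserInfo {
  public static void main(String[] args) throws SQLException {
    try {
      Class.forName("org.apache.hadoop.hive.jdbc.HiveDriver");
    } catch (ClassNotFoundException e) {
      e.printStackTrace();
      System.exit(1);
    }
    // Values that would come from the two text boxes (placeholders)
    String id = "someUser";
    String password = "somePassword";

    Connection con = DriverManager.getConnection("jdbc:hive://localhost:10000/enggheads", "", "");
    Statement stmt = con.createStatement();
    // Selecting the two literals from the one-row dummy table inserts exactly one row
    String sql = "insert into table userinfo select '" + id + "', '" + password + "' from dummy";
    System.out.println("Running: " + sql);
    stmt.executeQuery(sql);
    con.close();
  }
}

Because the values are concatenated into the query string, they should be escaped or validated first (the old HiveServer1 driver has only limited PreparedStatement support). Each such insert also launches a MapReduce job, so for anything beyond occasional single rows it is usually faster to append rows to a file and use LOAD DATA INPATH.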
