
Partition file not created for newly created Hive table

The table is created successfully, but no partition (or partition file) is created.

    CREATE EXTERNAL TABLE table_name (
      `col1` string,   -- column types were missing in the original DDL; OpenCSVSerde reads every column as string
      `col2` string)
    PARTITIONED BY (`biz_dt` date)   -- partition column declared here
    ROW FORMAT SERDE 'org.apache.hadoop.hive.serde2.OpenCSVSerde'
    WITH SERDEPROPERTIES ('quoteChar' = '\"', 'separatorChar' = ',')
    STORED AS INPUTFORMAT 'org.apache.hadoop.mapred.TextInputFormat'
    OUTPUTFORMAT 'org.apache.hadoop.hive.ql.io.HiveIgnoreKeyTextOutputFormat'
    LOCATION 'hdfs://path/'
    TBLPROPERTIES ('skip.header.line.count' = '1', 'transient_lastDdlTime' = '1563368415');
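
For reference, a quick way to confirm the symptom is to list the partitions right after running the DDL; on a freshly created table this returns an empty result:

    SHOW PARTITIONS table_name;
    -- returns no rows: the DDL only registers metadata in the metastore,
    -- it neither scans LOCATION nor creates any partition directories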

When you create a new table, no files are created. Hive only creates the folder where the files will be stored (if it does not already exist), and that's all. The files are created only when you insert data into the table.
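
A minimal sketch of that point (the values and the date below are placeholders, not from the question): a static-partition insert is what actually creates the partition directory and a data file under LOCATION.

    -- hypothetical example values; biz_dt is the partition column from the DDL above
    INSERT INTO TABLE table_name PARTITION (biz_dt = '2019-07-17')
    VALUES ('a', 'b');
    -- after this, hdfs://path/biz_dt=2019-07-17/ exists and contains a data file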

Also, no partition is added until you add one yourself, either with ALTER TABLE or dynamically through an INSERT into the table, as sketched below.
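
Two common ways to do that, sketched with placeholder paths and a hypothetical staging table (the path and staging_table name are assumptions, not from the question):

    -- 1) Register a partition whose files already sit in HDFS (typical for external tables)
    ALTER TABLE table_name ADD IF NOT EXISTS
      PARTITION (biz_dt = '2019-07-17') LOCATION 'hdfs://path/biz_dt=2019-07-17';

    -- or let Hive discover directories named biz_dt=... under the table LOCATION
    MSCK REPAIR TABLE table_name;

    -- 2) Create partitions dynamically while inserting
    SET hive.exec.dynamic.partition = true;
    SET hive.exec.dynamic.partition.mode = nonstrict;
    INSERT INTO TABLE table_name PARTITION (biz_dt)
    SELECT col1, col2, biz_dt FROM staging_table;   -- staging_table is hypothetical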

Hope this helps.

