
Unable to create Hive table using Presto from a CSV File

I want to create a Hive table using Presto, with the data stored in a CSV file on S3.

I have uploaded the file to S3 and I am sure that Presto is able to connect to the bucket.

Now, when I issue the create table command, all the values (rows) come back as NULL when I query the table.

I tried researching similar questions, but it turns out Presto is not that well covered on Stack Overflow.

Some lines from the file are:

PassengerId,Survived,Pclass,Name,Sex,Age,SibSp,Parch,Ticket,Fare,Cabin,Embarked
1,0,3,"Braund, Mr. Owen Harris",male,22,1,0,A/5 21171,7.25,,S
2,1,1,"Cumings, Mrs. John Bradley (Florence Briggs Thayer)",female,38,1,0,PC 17599,71.2833,C85,C
3,1,3,"Heikkinen, Miss. Laina",female,26,0,0,STON/O2. 3101282,7.925,,S
4,1,1,"Futrelle, Mrs. Jacques Heath (Lily May Peel)",female,35,1,0,113803,53.1,C123,S
5,0,3,"Allen, Mr. William Henry",male,35,0,0,373450,8.05,,S
6,0,3,"Moran, Mr. James",male,,0,0,330877,8.4583,,Q
7,0,1,"McCarthy, Mr. Timothy J",male,54,0,0,17463,51.8625,E46,S
8,0,3,"Palsson, Master. Gosta Leonard",male,2,3,1,349909,21.075,,S
9,1,3,"Johnson, Mrs. Oscar W (Elisabeth Vilhelmina Berg)",female,27,0,2,347742,11.1333,,S
10,1,2,"Nasser, Mrs. Nicholas (Adele Achem)",female,14,1,0,237736,30.0708,,C
11,1,3,"Sandstrom, Miss. Marguerite Rut",female,4,1,1,PP 9549,16.7,G6,S
12,1,1,"Bonnell, Miss. Elizabeth",female,58,0,0,113783,26.55,C103,S
13,0,3,"Saundercock, Mr. William Henry",male,20,0,0,A/5. 2151,8.05,,S
14,0,3,"Andersson, Mr. Anders Johan",male,39,1,5,347082,31.275,,S
15,0,3,"Vestrom, Miss. Hulda Amanda Adolfina",female,14,0,0,350406,7.8542,,S
16,1,2,"Hewlett, Mrs. (Mary D Kingcome) ",female,55,0,0,248706,16,,S
17,0,3,"Rice, Master. Eugene",male,2,4,1,382652,29.125,,Q
18,1,2,"Williams, Mr. Charles Eugene",male,,0,0,244373,13,,S
19,0,3,"Vander Planke, Mrs. Julius (Emelia Maria Vandemoortele)",female,31,1,0,345763,18,,S
20,1,3,"Masselmani, Mrs. Fatima",female,,0,0,2649,7.225,,C

My csv file is here, using the train.csv taken from here. So, my Presto command is:

create table testing_nan_4 (
    PassengerId integer,
    Survived integer,
    Pclass integer,
    Name varchar,
    Sex varchar,
    Age integer,
    SibSp integer,
    Parch integer,
    Ticket integer,
    Fare double,
    Cabin varchar,
    Embarked varchar
) with (
    external_location = 's3://my_bucket/titanic_train/',
    format = 'textfile'
);

The result is:

 passengerid | survived | pclass | name | sex  | age  | sibsp | parch | ticket | fare | cabin | embarked
-------------+----------+--------+------+------+------+-------+-------+--------+------+-------+----------
 NULL        | NULL     | NULL   | NULL | NULL | NULL | NULL  | NULL  | NULL   | NULL | NULL  | NULL
 NULL        | NULL     | NULL   | NULL | NULL | NULL | NULL  | NULL  | NULL   | NULL | NULL  | NULL
 NULL        | NULL     | NULL   | NULL | NULL | NULL | NULL  | NULL  | NULL   | NULL | NULL  | NULL
 NULL        | NULL     | NULL   | NULL | NULL | NULL | NULL  | NULL  | NULL   | NULL | NULL  | NULL
 NULL        | NULL     | NULL   | NULL | NULL | NULL | NULL  | NULL  | NULL   | NULL | NULL  | NULL
 NULL        | NULL     | NULL   | NULL | NULL | NULL | NULL  | NULL  | NULL   | NULL | NULL  | NULL
 NULL        | NULL     | NULL   | NULL | NULL | NULL | NULL  | NULL  | NULL   | NULL | NULL  | NULL
 NULL        | NULL     | NULL   | NULL | NULL | NULL | NULL  | NULL  | NULL   | NULL | NULL  | NULL

I expected to get the actual data instead.

Currently you must use 0x1-delimited ('\u0001') files for the textfile format to be read properly. The problem is that Presto does not support custom delimiters here:

https://github.com/prestodb/presto/issues/10905
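If you want to keep the textfile format, one workaround is to rewrite the file with the \u0001 delimiter before uploading it to S3. A minimal Python sketch (the file names are placeholders, and it assumes the standard csv module is enough to handle the quoted fields):

# Rewrite train.csv with the \u0001 delimiter that the textfile format expects.
# File names are placeholders; adjust them to your own paths.
import csv

with open("train.csv", newline="") as src, \
        open("train_u0001.txt", "w", newline="") as dst:
    reader = csv.reader(src)   # handles quoted fields such as "Braund, Mr. Owen Harris"
    next(reader)               # drop the header row, it would otherwise become a data row
    for row in reader:
        dst.write("\u0001".join(row) + "\n")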

The suggestion in that issue is to create the table using Hive DDL instead, which can then be read easily from Presto.

Here is the Hive query:

CREATE EXTERNAL TABLE mytable ( 
   PassengerId int, Survived int, Pclass int, Name string, Sex string, Age int, SibSp int, Parch int, Ticket int, Fare double, Cabin string, Embarked string 
)

ROW FORMAT SERDE 'org.apache.hadoop.hive.serde2.OpenCSVSerde'
WITH SERDEPROPERTIES (
  'separatorChar' = ',',
  'quoteChar' = '\"',
  'escapeChar' = '\\'
)
STORED AS TEXTFILE
LOCATION 's3://bucket-path/csv_data/'
TBLPROPERTIES (
  "skip.header.line.count"="1")

Starburst Presto currently supports the CSV Hive storage format, see: https://docs.starburstdata.com/latest/release/release-302-e.html?highlight=csv

There is also work in progress to make it work in PrestoSQL, see: https://github.com/prestosql/presto/pull/920

然后,您可以在Presto Hive連接器中使用以下表格:

CREATE TABLE hive.default.csv_table_with_custom_parameters (
    c_bigint varchar,
    c_varchar varchar)
WITH (
    csv_escape = '',
    csv_quote = '',  
    csv_separator = U&'\0001', -- to pass unicode character
    external_location = 'hdfs://hadoop/datacsv_table_with_custom_parameters',
    format = 'CSV')

In your case it would be:

CREATE TABLE hive.default.csv_table_with_custom_parameters (
    -- the Hive CSV format only supports varchar columns; cast to numeric types when querying
    PassengerId varchar, Survived varchar, Pclass varchar, Name varchar, Sex varchar, Age varchar,
    SibSp varchar, Parch varchar, Ticket varchar, Fare varchar, Cabin varchar, Embarked varchar)
WITH (
    csv_escape = '\',
    csv_quote = '"',
    csv_separator = ',',
    external_location = 's3://my_bucket/titanic_train/',
    format = 'CSV')
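Once the table exists you can run the DDL and check the data from any Presto client; for example, a minimal sketch using the presto-python-client package (the coordinator host, port and user below are assumptions about your environment):

import prestodb

# Connect to the Presto coordinator through the hive catalog.
conn = prestodb.dbapi.connect(
    host="presto-coordinator.example.com",   # placeholder host
    port=8080,
    user="hadoop",
    catalog="hive",
    schema="default",
)
cur = conn.cursor()
cur.execute("SELECT * FROM csv_table_with_custom_parameters LIMIT 5")
for row in cur.fetchall():
    print(row)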

Note that the csv_escape, csv_quote and csv_separator table properties accept a single character value only.

Also, there is no equivalent of "skip.header.line.count"="1" for CSV tables in Presto yet, so I suggest removing the header line from your data file.
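A minimal sketch for stripping the header before uploading the file (file names are placeholders):

# Drop the header line from train.csv, since the Presto CSV format
# has no equivalent of skip.header.line.count yet.
with open("train.csv") as src, open("train_noheader.csv", "w") as dst:
    next(src)              # skip the header row
    dst.writelines(src)    # copy the remaining lines unchanged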
