read a csv file with comma as delimiter and escaping quotes in psql
I want to read a CSV file that is separated by commas ( , ) but ignore commas that appear inside double quotes ( " ). I want to store the result into a table.
Example:
abc,00.000.00.00,00:00:00:00:00:00,Sun Nov 01 00:00:00 EST 0000,Sun Nov 01 00:00:00 EST 0000,"Apple, Inc.",abcd-0000abc-a,abcd-abcd-a0000-00
Here I don't want to split on the comma inside "Apple, Inc.".
I know a CSV reader exists in Python and I could use it in PL/Python, but that is slow considering millions of such strings! I would like a pure psql method!
Here is an example of reading a CSV file with an External Table using the CSV format.
CREATE EXTERNAL TABLE ext_expenses
    ( name text, date date, amount float4, category text, desc1 text )
LOCATION ('gpfdist://etlhost-1:8081/*.txt',
          'gpfdist://etlhost-2:8082/*.txt')
FORMAT 'CSV' ( DELIMITER ',' )
LOG ERRORS SEGMENT REJECT LIMIT 5;
This was taken from the Greenplum docs too: http://gpdb.docs.pivotal.io/530/admin_guide/external/g-example-4-single-gpfdist-instance-with-error-logging.html
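If the file is accessible from the database server and plain PostgreSQL (rather than Greenplum) is an option, the built-in COPY command in CSV mode also keeps quoted fields intact, so "Apple, Inc." stays in one column. A minimal sketch, where the table name, column names, and file path are all assumptions chosen to match the sample row above:

```sql
-- Hypothetical target table matching the sample row (all names are assumptions)
CREATE TABLE expenses (
    name    text,
    ip      text,
    mac     text,
    ts1     text,
    ts2     text,
    vendor  text,   -- holds values like "Apple, Inc." without splitting on the inner comma
    code1   text,
    code2   text
);

-- FORMAT csv makes COPY treat commas inside double quotes as part of the field
COPY expenses FROM '/path/to/data.csv'
    WITH (FORMAT csv, DELIMITER ',', QUOTE '"');
```

From a client session, the psql meta-command \copy takes the same options but reads the file from the client machine instead of the server.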