I have psql set up in my macOS terminal. It's hooked up to my PostgreSQL database running on Amazon RDS.
I have 100 CSV files (with names 1,2,3,4 up to 100). I want to bulk import them. I see there are some scripts ( https://dba.stackexchange.com/questions/168861/import-100-csv-files-into-postgresql/169089#169089 ), but I don't know how to run a script.
I tried copying and pasting this script -
for x in $(ls <folder_name>*.csv);
do psql -c "copy table_name from '$x' csv" also; done
And I received these errors -
db=> for x in $(ls <folder_name>*.csv);
ERROR: syntax error at or near "for"
LINE 1: for x in $(ls <folder_name>...
^
db=> do psql -c "copy <table_name> from '$x' csv" also; done
ERROR: syntax error at or near "psql"
LINE 1: do psql -c "copy <table_name> from '$x' csv" also;
Can you help me a) figure out the right script to bulk import these files and b) figure out how to execute the script?
Note - all the files are going to the same table, which already exists.
The errors happened because you pasted a bash script into the psql prompt (db=>); psql expects SQL, so the script has to be run from your terminal's shell instead. Considering this table:
CREATE TABLE t (id INT, description TEXT);
And the following files (each with its own header row):

1.csv:
id,description
1,foo

2.csv:
id,description
2,bar
Execute the following shell script from your terminal's shell prompt, not from inside psql (add -h <host> and -U <user> options to the psql call as needed for your RDS connection):
#!/bin/bash
path="/home/user/files/"
for f in "$path"*.csv; do
    cat "$f" | psql testdb -c "COPY t FROM STDIN DELIMITER ',' CSV HEADER"
done
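For part (b) of the question: to execute a shell script, save it to a file and run it with bash from the macOS terminal prompt. A minimal demonstration with a stand-in one-line script (in practice the file would contain the psql loop above):

```shell
# Create a stand-in script file; your real file would hold the psql loop.
printf '#!/bin/bash\necho "import complete"\n' > import.sh

# Option 1: run it with bash directly
bash import.sh        # prints: import complete

# Option 2: mark it executable once, then run it by path
chmod +x import.sh
./import.sh           # prints: import complete
```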
And there you have your data:
$ psql testdb -c "SELECT * FROM t"
id | description
----+-------------
1 | foo
2 | bar
(2 rows)
If you are loading everything into the same table and the headers can be excluded, another option is to concatenate the files into a single CSV:
bash-$ head -1 folder_name/1.csv > combined_file.csv
bash-$ cat folder_name/*.csv | grep -v 'headerpattern' >> combined_file.csv
Then import it. Note that a plain COPY ... FROM 'file' reads the file on the database server, which is not possible with RDS, so use psql's client-side \copy instead (and keep CSV HEADER, since combined_file.csv still starts with one header row):
postgres# \copy yourtable FROM 'combined_file.csv' DELIMITER ',' CSV HEADER
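The header-stripping step can also be done without guessing a grep pattern by skipping the first line of every file with tail -n +2. A self-contained sketch (the csvs/ directory and sample files are created here purely for illustration):

```shell
#!/bin/bash
# Create two sample CSVs matching the example data (illustration only).
mkdir -p csvs
printf 'id,description\n1,foo\n' > csvs/1.csv
printf 'id,description\n2,bar\n' > csvs/2.csv

# Keep the header line from the first file only...
head -1 csvs/1.csv > combined_file.csv

# ...then append every file minus its own header line.
for f in csvs/*.csv; do
    tail -n +2 "$f" >> combined_file.csv
done

cat combined_file.csv
# id,description
# 1,foo
# 2,bar
```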