I have to import data from a CSV file that has more than 100 columns. I've tried to import it using pgAdmin and phpPgAdmin, but both of them require me to create the table with the columns first. Defining the columns one by one is very time-consuming. I want to keep it simple, because all of the columns have the same datatype, which is text.
The code below won't work because I'd have to specify the datatype for each column, which is time-consuming.
CREATE TABLE my_table (
    col_1, col_2, ..., col_100
)
Is there a way to do the above while specifying the datatype only once for all columns? Thanks.
The easiest way:
1- Copy the CSV header row into a new CSV file.
2- Manually edit that file with Notepad++ to pivot the data (replace ';' with '\n') so that the 100 column names end up in a single column. This assumes the column names don't contain the CSV separator.
3- Copy that column into a temp table: copy tmp_table from '/tmp/listclmn.csv' ... (without the HEADER clause).
4- Generate the DDL of your table as:
select 'create table my_table(' union all select concat(colname, ' text,') from tmp_table union all select ');';
/* delete the comma after the last column before running the statement */
5- Create the generated table and copy all the data from the original CSV, this time using the HEADER clause.
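The manual steps above (grab the header, pivot it, build the DDL) can also be scripted. A minimal Python sketch of that idea, assuming a semicolon-separated file; the function and table names here are illustrative, not part of the original answer:

```python
import csv

def generate_ddl(csv_path, table_name, sep=";"):
    """Read only the CSV header row and emit a CREATE TABLE
    statement that declares every column as text."""
    with open(csv_path, newline="") as f:
        header = next(csv.reader(f, delimiter=sep))
    # Quote each name so spaces or mixed case survive in Postgres.
    cols = ",\n  ".join(f'"{name}" text' for name in header)
    return f"CREATE TABLE {table_name} (\n  {cols}\n);"
```

Run the printed statement in psql, then load the data with something like \copy my_table from 'data.csv' with (format csv, header, delimiter ';').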
There is no built-in way of doing that, but if you want to import a CSV file without writing the column names manually, you can write a Postgres procedure with the following logic.
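The procedure itself isn't shown in the answer; as a sketch of that logic (read the header, create an all-text table, then copy the data), here is a hypothetical Python helper that emits a ready-to-run psql script. The name build_import_script and the default comma delimiter are assumptions:

```python
import csv

def build_import_script(csv_path, table_name, sep=","):
    """Emit a psql script: a CREATE TABLE with every column as text,
    followed by a \\copy that loads the original file, skipping its header."""
    with open(csv_path, newline="") as f:
        header = next(csv.reader(f, delimiter=sep))
    cols = ", ".join(f'"{c}" text' for c in header)
    ddl = f"CREATE TABLE {table_name} ({cols});"
    copy = (f"\\copy {table_name} FROM '{csv_path}' "
            f"WITH (FORMAT csv, HEADER, DELIMITER '{sep}');")
    return ddl + "\n" + copy
```

Feed the returned script to psql -f; \copy runs client-side, so it works without server filesystem access, unlike plain COPY.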