I have a CSV file with 320 columns, where the header row simply holds the trip numbers.
Each column represents one trip: a route of street names from A to B. (It was originally a simple list that I transposed in Excel, so the trip numbers became the column headers.)
I would like to import it into a PostgreSQL table.
I saw a similar previous question, but since it's only 320 columns, I wonder whether that is the best structure and, if so, how to loop the column creation so I can insert the data through pgAdmin.
So far I have figured out this, which returns an error message:
DO
$do$
BEGIN
   FOR i IN 1..320 LOOP
      INSERT INTO runs (col_i, col_id)  -- use col names
      SELECT i, id
      FROM   tbl;
   END LOOP;
END
$do$;
Many thanks
To create a table with 320 text columns:
DO
$$
BEGIN
   EXECUTE (
      SELECT 'CREATE TABLE my_tbl (c'
          || string_agg(g::text, ' text, c')
          || ' text)'
      FROM   generate_series(1, 320) g
      );
END
$$;
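Once the table exists, the transposed CSV could be loaded with COPY. A minimal sketch, assuming the file lives on the database server and the CSV columns line up with c1..c320 (the path is a placeholder, not a real value from the question):

```sql
-- Sketch only: the file path is a placeholder; adjust CSV options as needed.
COPY my_tbl FROM '/path/to/trips.csv' WITH (FORMAT csv, HEADER true);
```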
A sane relational design might be (wild guess, there's not nearly enough information):
CREATE TABLE street (
street_id serial PRIMARY KEY
, street text NOT NULL
);
CREATE TABLE trip (
trip_id serial PRIMARY KEY
, whatever text
);
CREATE TABLE trip_step (
trip_id int REFERENCES trip
, step int
, street_id int NOT NULL REFERENCES street
, PRIMARY KEY (trip_id, step)
);
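To move the data from the wide import table into this design, the loop from the question could be made dynamic with EXECUTE and format(). A sketch under two assumptions that are not in the original: my_tbl carries an extra step serial column added before the CSV import (so row order is preserved), and street(street) has a UNIQUE constraint so ON CONFLICT can de-duplicate:

```sql
-- Hypothetical sketch: assumes my_tbl(step serial, c1..c320 text)
-- and a UNIQUE constraint on street(street).
DO
$$
BEGIN
   FOR i IN 1..320 LOOP
      -- one row per trip; the column number doubles as the trip id
      INSERT INTO trip (trip_id) VALUES (i);

      -- collect new street names from column c<i>
      EXECUTE format(
         'INSERT INTO street (street)
          SELECT c%1$s FROM my_tbl WHERE c%1$s IS NOT NULL
          ON CONFLICT (street) DO NOTHING', i);

      -- record the ordered steps of trip <i>
      EXECUTE format(
         'INSERT INTO trip_step (trip_id, step, street_id)
          SELECT %1$s, t.step, s.street_id
          FROM   my_tbl t
          JOIN   street s ON s.street = t.c%1$s', i);
   END LOOP;
END
$$;
```

format() with the %1$s positional specifier splices the loop counter into each column name, which is what the static col_i in the question's loop could not do.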
Consider:
Depending on your data types, the maximum number of columns per table ranges from 250 to 1600, so that may be a problem:
What is the maximum number of columns in a PostgreSQL select query
If you already have your data in Excel, you can save it as CSV and use COPY
to import the data:
https://stackoverflow.com/a/2987451/3470178
Or use string concatenation in Excel to create the INSERT statements.
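For the COPY route, psql's \copy meta-command reads the file client-side, which avoids server file-permission issues (the linked answer covers this). A sketch, assuming a target table named my_tbl; the path is a placeholder:

```sql
-- Sketch only: run from psql; file is read from the client machine.
\copy my_tbl FROM 'trips.csv' WITH (FORMAT csv, HEADER true)
```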