
What's the best practice for importing a large CSV into a normalized relational database (with multiple tables)?

This is the structure of my database. Currently, I have one big CSV file that contains all of the fields. I'm considering several solutions:

  1. Split the CSV into multiple files, normalize them, and then import each of them into SQL.
  2. Import the big CSV into SQL first and then split it into the separate tables inside the database.

I'm still at the stage of learning SQL, so I'm looking for a simple way to import the data. Any suggestions? [image: database schema]

This is too long for a comment.

I would bring your giant CSV file into a staging table -- you might even want all the fields to be strings, if there might be data conversion issues.
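
A minimal sketch of that staging-table load, assuming Postgres; the table name staging_orders, the column names, and the file path are all hypothetical placeholders for whatever your CSV actually contains:

    -- Staging table: every column is text, so the bulk load cannot fail on bad values.
    -- Table and column names are made up for illustration.
    CREATE TABLE staging_orders (
        customer_name text,
        order_date    text,
        product_name  text,
        quantity      text
    );

    -- Server-side bulk load; use \copy from psql instead if the CSV sits on your client machine.
    COPY staging_orders
    FROM '/path/to/big_file.csv'
    WITH (FORMAT csv, HEADER true);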

Then, use the staging table to populate the tables in your data model. If you are using Postgres, you can actually set up a single set of CTEs that loads all of the tables in one statement.
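
A minimal sketch of that CTE approach, again with hypothetical target tables (customers with a generated id, and orders) and assuming an initial load into empty tables, so every customer row comes back from RETURNING:

    -- One statement: the data-modifying CTE inserts the distinct customers,
    -- RETURNING hands back their generated ids, and the outer INSERT uses
    -- those ids to load the child table. Type casts happen here, after staging.
    WITH ins_customers AS (
        INSERT INTO customers (name)
        SELECT DISTINCT customer_name
        FROM staging_orders
        RETURNING id, name
    )
    INSERT INTO orders (customer_id, order_date, product_name, quantity)
    SELECT c.id,
           s.order_date::date,
           s.product_name,
           s.quantity::int
    FROM staging_orders s
    JOIN ins_customers c ON c.name = s.customer_name;

If the target tables may already contain data, you would need an ON CONFLICT clause plus a second lookup against customers, because RETURNING only reports rows that were actually inserted.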
