
MySQL LOAD DATA INFILE duplicates

I have a large CSV file with 12,000 rows and 90 columns.

I want to use the MySQL LOAD DATA INFILE query to upload it to my MySQL database.

But I keep getting an error saying that my CSV has duplicates on the primary key.

I am sure that it does not have duplicates on the primary key.

What could be the problem?

Here is my code:

$sql = "LOAD DATA INFILE '/a_bysch_store (2).csv' INTO TABLE a_bysch"
. " FIELDS TERMINATED BY ','"
. " LINES TERMINATED BY '\r\n'"
. " IGNORE 1 LINES"; 

//Try to execute query (not stmt) and catch mysqli error from engine and php error
if (!($stmt = $mysqli->query($sql))) {
   echo "\nQuery execute failed: ERRNO: (" . $mysqli->errno . ") " . $mysqli->error;
}

Instead of LOAD DATA INFILE, use LOAD DATA LOCAL INFILE. With LOCAL, duplicate-key errors are downgraded to warnings and the file should import with the duplicate rows skipped; this is the same behaviour as the IGNORE keyword, so you can use either one. The IGNORE 1 LINES clause you already have only skips the header line, it does not ignore duplicate keys.
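
Here is a minimal sketch of that change, assuming the same table, file path, and $mysqli connection from your question. Note that LOCAL also requires the local_infile option to be enabled on the server, and mysqli may need MYSQLI_OPT_LOCAL_INFILE set before connecting; with LOCAL, skipped duplicates are reported as warnings rather than an error.

$sql = "LOAD DATA LOCAL INFILE '/a_bysch_store (2).csv' INTO TABLE a_bysch"
. " FIELDS TERMINATED BY ','"
. " LINES TERMINATED BY '\r\n'"
. " IGNORE 1 LINES";

//Execute and report any engine error; with LOCAL, skipped duplicates show up as warnings
if (!$mysqli->query($sql)) {
   echo "\nQuery execute failed: ERRNO: (" . $mysqli->errno . ") " . $mysqli->error;
} elseif ($result = $mysqli->query("SHOW WARNINGS")) {
   while ($row = $result->fetch_assoc()) {
       echo "\n" . $row['Level'] . " (" . $row['Code'] . "): " . $row['Message'];
   }
   $result->free();
}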

Also, if you are using an auto-increment column as the primary key, don't pass that value in the CSV file; list the remaining columns explicitly instead, as in the sketch below.
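
For illustration, a sketch with an explicit column list; the names col2, col3, col4 are placeholders, since the real 90 column names of a_bysch are not shown in the question. Columns omitted from the list, such as an auto-increment primary key, receive their automatically generated values; if the CSV still contains the key column, you can read it into a throwaway user variable such as @dummy instead.

$sql = "LOAD DATA LOCAL INFILE '/a_bysch_store (2).csv' INTO TABLE a_bysch"
. " FIELDS TERMINATED BY ','"
. " LINES TERMINATED BY '\r\n'"
. " IGNORE 1 LINES"
//Placeholder names: map every CSV field except the auto-increment key
. " (col2, col3, col4)";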
