
How to upload a CSV file into a MySQL database faster

I want to upload a CSV file into a MySQL database through PHP. I already have more than 20,000 records in the database. When I upload a CSV file containing around 1,000 records, it takes a very long time, even on my local machine.

Please help and suggest an optimized query for uploading a CSV file into a MySQL database that already holds a large number of records.

Does the number of records affect database performance?

EDIT from comments

Currently used code:

LOAD DATA INFILE '$file_name' IGNORE
INTO TABLE import
FIELDS TERMINATED BY '|'
LINES TERMINATED BY '\n'
IGNORE 1 LINES
(@srno, @customer_name, @date, @mobno, @city, @state, @type, @telecaller)
SET customer_name = @customer_name, date = @date, mobno = @mobno,
    city = @city, state = @state, type = @type, telecaller = @telecaller,
    datetime = '$datetime';

Use MySQL's command-line tool with LOAD DATA INFILE:

http://dev.mysql.com/doc/refman/5.1/en/load-data.html
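
If you would rather issue the statement from PHP than from the command-line client, a minimal sketch along these lines should work, assuming a mysqli connection and that LOCAL INFILE is enabled on both the client and the server. The connection details, the file path, and the NOW() default for the datetime column are illustrative, not from the original post. Disabling unique and foreign-key checks for the duration of the load is a documented MySQL bulk-loading speed-up for InnoDB tables:

    <?php
    // Hypothetical connection details -- replace with your own.
    $mysqli = mysqli_init();
    $mysqli->options(MYSQLI_OPT_LOCAL_INFILE, true); // must be set before connecting
    $mysqli->real_connect('localhost', 'user', 'password', 'mydb');

    // Documented bulk-load speed-up: skip these checks while loading, restore after.
    $mysqli->query('SET unique_checks = 0');
    $mysqli->query('SET foreign_key_checks = 0');

    $mysqli->query("LOAD DATA LOCAL INFILE '/path/to/your_file.csv' IGNORE
        INTO TABLE import
        FIELDS TERMINATED BY '|'
        LINES TERMINATED BY '\\n'
        IGNORE 1 LINES
        (@srno, customer_name, date, mobno, city, state, type, telecaller)
        SET datetime = NOW()"); // stand-in for the '$datetime' value in the question

    $mysqli->query('SET unique_checks = 1');
    $mysqli->query('SET foreign_key_checks = 1');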

Use PHP's fgetcsv() function to read the data from the CSV file, format it the way you want, and build the query to submit to the MySQL DB. I have used it to populate a DB with 1.5M rows from CSV files containing more than 10,000 records each, without any problem. 1,000 records shouldn't be a problem.

Example:

      $h = fopen("your_file.csv", "r");
      $data = fgetcsv($h, "10000", ",");

You will then have the first line of your CSV file in $data[0], and $data[0][0] will contain its first field (each line is split on ","). For example, if the line is "cat, dog, rat", then:

    $data[0][0] = "cat"
    $data[0][1] = "dog"

etc. Now that you have the records in the $data array, you can use them to build SQL statements and insert them into the DB.
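
One caveat for speed: inserting the rows one statement at a time is usually what makes this slow. A rough sketch of batching them into a single multi-row INSERT instead (the import table and column names are carried over from the question; the $mysqli connection and everything else here is illustrative):

    <?php
    // Assumes $data was filled by the fgetcsv loop above and
    // $mysqli is an open mysqli connection (hypothetical).
    $values = array();
    foreach (array_slice($data, 1) as $row) { // skip the header line
        list($srno, $name, $date, $mobno, $city, $state, $type, $telecaller) = $row;
        $values[] = sprintf("('%s','%s','%s','%s','%s','%s','%s')",
            $mysqli->real_escape_string($name),
            $mysqli->real_escape_string($date),
            $mysqli->real_escape_string($mobno),
            $mysqli->real_escape_string($city),
            $mysqli->real_escape_string($state),
            $mysqli->real_escape_string($type),
            $mysqli->real_escape_string($telecaller));
    }
    // One multi-row INSERT instead of 1,000 single-row statements.
    $mysqli->query('INSERT INTO import
        (customer_name, date, mobno, city, state, type, telecaller)
        VALUES ' . implode(',', $values));

For very large files you may need to split $values into chunks so each statement stays under the server's max_allowed_packet limit.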
