
MySQL: inserting large amounts of data

I'm currently trying to optimize some code.

What is a better way to insert a large amount of data into a table?

Consider this code:

$arrayOfdata = execute query ("SELECT c.id FROM table1 c")

i.e. all the rows from table1 are fetched into an array and then inserted into the other table one by one:

private function insertSomeData($arrayOfdata, $other_id) {
    foreach ($arrayOfdata as $data) {
        // one INSERT statement per row
        execute query (
            'INSERT INTO table (
                other_id,
                other2_id,
                is_transfered
            ) VALUES (' . $other_id . ', ' . $data['id'] . ', 0)'
        );
    }
}

I know this code is very slow if table1 holds 500k rows, so I tried putting everything into one SQL query:

INSERT INTO 
table (
    other_id,
    other2_id,
    is_transfered
) 
SELECT 
    "other_id", c.id, 0 
FROM table1 c

I've read that inserting too much data at once can cause MySQL to slow down or time out. I tried this code on 500k rows on my local machine and it runs fine.

Is there any situation in which inserting large amounts of data like this will cause a problem? Are there faster ways to insert that won't make the server use too many resources?

For inserting a huge number of records, you should insert them in batches instead of calling INSERT for each row, like:

INSERT INTO [Table] ([Column List]) 
       VALUES ([Value List 1])
            , ([Value List 2])
            , [...]
            , ([Value List N]);

Update, in response to the comments below:

You can use a loop to append records to the query until a counter reaches the batch size. Once the batch size is reached, trigger a MySQL INSERT and reset the counter, as sketched below.
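
A rough sketch of that loop in PHP. $db is assumed to be an open mysqli connection; $arrayOfdata and $other_id are the variables from the question, and the ids are cast to int as a minimal safety measure:

function insertInBatches(mysqli $db, array $arrayOfdata, $other_id, $batchSize = 1000)
{
    $values = array();
    foreach ($arrayOfdata as $data) {
        $values[] = '(' . (int)$other_id . ', ' . (int)$data['id'] . ', 0)';
        if (count($values) >= $batchSize) {
            // batch size reached: trigger one multi-row INSERT, reset the counter
            $db->query('INSERT INTO `table` (other_id, other2_id, is_transfered) VALUES ' . implode(', ', $values));
            $values = array();
        }
    }
    if ($values) {
        // flush the last, partially filled batch
        $db->query('INSERT INTO `table` (other_id, other2_id, is_transfered) VALUES ' . implode(', ', $values));
    }
}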

Also, if the data is large and you just need to migrate it to a new table, you can append the queries to a db.sql file and load it directly via the SQL console, like:

USE newdb
SOURCE db.sql

You definitely don't want to fetch all the data from the first table to the client and then insert it row by row into the target table.

I'd suggest going with the INSERT INTO ... SELECT FROM ... syntax for transferring data from one table to another. If you want to transfer the data in batches, you can always use a LIMIT clause in the SELECT together with an OFFSET.
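
For example, to copy one batch (the 123 stands in for the question's $other_id, and the OFFSET would advance by the batch size on each run):

-- Copy one batch of 10000 rows; advance the OFFSET by 10000 on each run.
INSERT INTO `table` (other_id, other2_id, is_transfered)
SELECT 123, c.id, 0
FROM table1 c
ORDER BY c.id
LIMIT 10000 OFFSET 0;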

Another approach would be to dump the data you need to a file first with SELECT ... INTO OUTFILE and then use LOAD DATA INFILE to load it into the other table. You can split the file if it's too much to load at once.
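
Something along these lines (the file path is just an example and must be writable by the MySQL server):

-- Dump the rows to a file on the database server:
SELECT 123, c.id, 0
FROM table1 c
INTO OUTFILE '/tmp/transfer.csv'
FIELDS TERMINATED BY ',' LINES TERMINATED BY '\n';

-- Load the file into the target table:
LOAD DATA INFILE '/tmp/transfer.csv'
INTO TABLE `table`
FIELDS TERMINATED BY ',' LINES TERMINATED BY '\n'
(other_id, other2_id, is_transfered);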

Also make sure that your target table doesn't have any indexes while you're transferring the data. Once you've finished the transfer, create all the necessary indexes.
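
For instance (idx_other2 is a hypothetical index name used for illustration):

ALTER TABLE `table` DROP INDEX idx_other2;              -- before the transfer
-- ... transfer the data ...
ALTER TABLE `table` ADD INDEX idx_other2 (other2_id);   -- after the transfer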

In my experience the fastest way to import massive amounts of data into a MySQL table is via a file. If the data is huge, INSERT INTO ... SELECT FROM ... might time out due to memory pressure (I've seen this happen). However, dumping the data from one table into a file and then loading that file into the other table works smoothly and is possibly the fastest way with large datasets.

To select data into a file, see SELECT ... INTO OUTFILE in the MySQL manual.

To load data from a file, have a look at LOAD DATA INFILE.

Hope this helps.

The following methods can help when inserting a large number of rows (sketched in SQL after the list):

  1. Try locking the table before starting the insertion, then unlock it afterwards.
  2. Disable MySQL index updates during the load and re-enable them when done.
  3. Extended (multi-row) inserts can also help you here.
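
A sketch of the three in SQL, using the table from the question (123 stands in for $other_id; note that DISABLE KEYS only affects non-unique indexes on MyISAM tables):

-- 1. Lock the target table for the duration of the load:
LOCK TABLES `table` WRITE;

-- 2. Disable index maintenance during the load:
ALTER TABLE `table` DISABLE KEYS;

-- 3. Extended (multi-row) insert:
INSERT INTO `table` (other_id, other2_id, is_transfered)
VALUES (123, 1, 0), (123, 2, 0), (123, 3, 0);

ALTER TABLE `table` ENABLE KEYS;
UNLOCK TABLES;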

Try to do it as a background job, in smaller portions. For example, limit the number of records to, let's say, 100, then run your script from crontab every 2 minutes. It will take much longer, but you will not choke the server; between your iterations the server will run normally. The 100 records and 2 minutes are configurable parameters; you have to figure out the optimal values for them.
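
A minimal sketch of one such batch step, using the target table itself to track progress. The connection details, the $other_id value, and the script path in the crontab line are placeholders:

// One batch step, meant to be run from crontab, e.g.:
//   */2 * * * * php /path/to/transfer_step.php
$batchSize = 100;
$other_id  = 123;

$db = new mysqli('localhost', 'user', 'pass', 'mydb');

// Resume after the highest source id already copied for this $other_id.
$row = $db->query(
    "SELECT COALESCE(MAX(other2_id), 0) AS last_id
     FROM `table` WHERE other_id = $other_id"
)->fetch_assoc();
$lastId = (int)$row['last_id'];

// Copy the next batch; the job is finished once this inserts 0 rows.
$db->query(
    "INSERT INTO `table` (other_id, other2_id, is_transfered)
     SELECT $other_id, c.id, 0
     FROM table1 c
     WHERE c.id > $lastId
     ORDER BY c.id
     LIMIT $batchSize"
);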

You can insert all of your data with a single query, which MySQL runs only once.

Just try this:

private function insertSomeData($arrayOfdata, $other_id) {
    $sql1 = "INSERT INTO table (
                 other_id,
                 other2_id,
                 is_transfered
             ) VALUES ";  // initial query string
    $sql2 = '';
    foreach ($arrayOfdata as $data) {
        $sql2 .= "('" . $other_id . "', '" . $data['id'] . "', '0'),"; // append one value tuple
    }
    $sql2 = mb_substr($sql2, 0, -1); // remove the trailing comma
    $sql  = $sql1 . $sql2;           // build the full query
    mysql_query($sql); // execute once. I suggest you use mysqli_* or PDO, because mysql_* is deprecated.
}

Assuming that you are using the InnoDB engine (which is the default in recent MySQL versions), you should simply use transactions: wrap your insert loop in a BEGIN; ... COMMIT; block.

By default, every statement runs as its own transaction, and the server must make sure the data safely reaches disk before continuing to the next statement. If you start a transaction, do many inserts, and then commit, only then must the server flush all the data onto the disk. On modern hardware, this could amount to only a few disk operations instead of 500k of them.

Another consideration is to use prepared statements. The server has to parse every SQL statement before executing it. This parsing does not come for free, and it is not unusual for the parsing time to be more expensive than the actual query execution time. Usually this parsing happens every single time, which in your case means 500k times. With prepared statements, parsing/preparation is done only once, and the cost of executing a statement is only the disk write (which is reduced further inside an active transaction, because the server can batch writes by delaying them until the transaction commits).

The total improvement from using these methods can be dramatic: I have seen situations where using transactions reduced the total run time from 30 minutes to 20 seconds.

A minimal PDO sketch of both ideas together (the DSN and credentials are placeholders; $arrayOfdata and $other_id are the variables from the question):
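
// One transaction around many executions of a single prepared statement.
$pdo = new PDO('mysql:host=localhost;dbname=mydb', 'user', 'pass');
$pdo->setAttribute(PDO::ATTR_ERRMODE, PDO::ERRMODE_EXCEPTION);

$pdo->beginTransaction();

// Parsed/prepared once...
$stmt = $pdo->prepare(
    'INSERT INTO `table` (other_id, other2_id, is_transfered) VALUES (?, ?, 0)'
);

// ...executed many times.
foreach ($arrayOfdata as $data) {
    $stmt->execute(array($other_id, $data['id']));
}

$pdo->commit(); // a single flush to disk instead of one per row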
