
Converting an array with 50,000+ entries into a MySQL database

Ok, so I am helping someone with their site. Their original coder thought it would be a great idea to put their entire database in an array.

I am looping through each piece of data in the array and inserting it into a new MySQL table I created. The rows are getting inserted just fine, but the problem is that PHP won't let my script execute for more than 60 seconds.

How can I override this setting?

There's the classic solution: set_time_limit(), which lets you set the maximum execution time (in seconds) for a script.

Still, you might want to either insert multiple rows with one query, or convert the array into CSV and use LOAD DATA, which is much faster.

Edit: examples

Inserting multiple rows:

INSERT INTO tbl_name (a,b,c) VALUES(1,2,3),(4,5,6),(7,8,9);

Ref: http://dev.mysql.com/doc/refman/5.1/en/insert.html
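As a rough PHP sketch of how you might build those multi-row inserts from the array (the $bigArray variable, the connection credentials, and the three-column tbl_name layout are all assumptions for illustration):

<?php
// Hypothetical sketch: $bigArray, the credentials, and tbl_name(a,b,c) are assumed.
// Chunks of ~500 rows keep each INSERT statement well under max_allowed_packet.
$conn = new mysqli('localhost', 'user', 'pass', 'db');
foreach (array_chunk($bigArray, 500) as $chunk) {
    $values = array();
    foreach ($chunk as $row) {
        // Escape each value before interpolating it into the SQL string.
        $values[] = "('" . $conn->real_escape_string($row['a']) . "','"
                  . $conn->real_escape_string($row['b']) . "','"
                  . $conn->real_escape_string($row['c']) . "')";
    }
    $conn->query('INSERT INTO tbl_name (a,b,c) VALUES ' . implode(',', $values));
}

Each chunk is one round trip to the server instead of 500, which is where most of the speedup comes from.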

Using LOAD DATA:

LOAD DATA INFILE 'data.txt' INTO TABLE db2.my_table;

Ref: http://dev.mysql.com/doc/refman/5.1/en/load-data.html
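A sketch of the CSV route in PHP, assuming a hypothetical $bigArray with keys a/b/c and a target table my_table; note that LOCAL INFILE has to be enabled on both the client and the server:

<?php
// Hypothetical sketch: dump $bigArray to a CSV file, then bulk-load it.
$fh = fopen('/tmp/data.csv', 'w');
foreach ($bigArray as $row) {
    fputcsv($fh, array($row['a'], $row['b'], $row['c']));
}
fclose($fh);

// MYSQLI_OPT_LOCAL_INFILE must be set before connecting.
$conn = mysqli_init();
$conn->options(MYSQLI_OPT_LOCAL_INFILE, true);
$conn->real_connect('localhost', 'user', 'pass', 'db');
$conn->query("LOAD DATA LOCAL INFILE '/tmp/data.csv'
              INTO TABLE my_table
              FIELDS TERMINATED BY ',' ENCLOSED BY '\"'
              LINES TERMINATED BY '\\n'");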

Solutions:

  • Set max_execution_time to 0 in the php.ini or use set_time_limit()

  • Run the PHP script at the command-line. The time limit for CLI scripts is hardcoded to 0 already.

  • Dump the array out to a CSV file and then use LOAD DATA INFILE to make the database import go 20x faster than inserting one row at a time in a loop.

You can use the set_time_limit() function at the top of your page, like this:

set_time_limit(0);

Try ini_set('max_execution_time', $time_in_seconds);


If you are using mysqli, I would use multi_query() to batch the operations. The whole thing will run a lot faster.
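A minimal sketch of that approach (again, $bigArray, the credentials, and tbl_name are assumptions): the statements are concatenated with semicolons and sent in one call, and every pending result must be flushed before the connection can be reused.

<?php
// Hypothetical sketch: batch the inserts through mysqli::multi_query.
$conn = new mysqli('localhost', 'user', 'pass', 'db');
$sql = '';
foreach ($bigArray as $row) {
    $sql .= "INSERT INTO tbl_name (a,b,c) VALUES ('"
          . $conn->real_escape_string($row['a']) . "','"
          . $conn->real_escape_string($row['b']) . "','"
          . $conn->real_escape_string($row['c']) . "');";
}
if ($conn->multi_query($sql)) {
    // Consume all result sets so the connection is usable afterwards.
    while ($conn->more_results() && $conn->next_result()) { }
}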
