Using Laravel Eloquent, I'm copying 7 million rows of data from one table in my old MySQL database and inserting those rows into different tables in my new MySQL database. The problem is that this took almost a full day, and I need to repeat the process for almost 80 million rows. I'm processing the data in chunks of 1000 rows at a time. Is there any way to do this more efficiently? Here is my code:
DB::connection('oldDataBase')->table('tableToCopy')->chunk(1000, function ($AllData) {
    foreach ($AllData as $Data) {
        // One INSERT statement per row, per table: three round trips
        // to the new database for every source row.
        DB::connection('newDataBase')->table('table1')->insert([
            'column1' => $Data->columnToCopy,
            // etc.
        ]);
        DB::connection('newDataBase')->table('table2')->insert([
            'column1' => $Data->columnToCopy,
            // etc.
        ]);
        DB::connection('newDataBase')->table('table3')->insert([
            'column1' => $Data->columnToCopy,
            // etc.
        ]);
    }
});
Doing this data migration from a SQL client like Laravel is not a good idea.
If I had to move 80M rows, I'd take the following steps:
1. Dump the source table to CSV files, split into manageable chunks.
2. Disable the indexes and constraints on the target tables.
3. Use LOAD DATA INFILE to slurp up the CSV files one after the other. For fastest results this should be run from the mysql or mysqlimport command-line client program running on the same machine as the MySQL server.
4. Re-enable the indexes and constraints.

I'd test this extensively before the migration day. I'd do things like load the first and last chunk of CSV and re-enable the indexes and constraints.
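As a rough sketch of steps 1 through 4 (the table names, column list, and file path here are placeholders, not taken from the question; the OUTFILE path must be writable by the server and permitted by its secure_file_priv setting):

    -- On the old database: export the table to CSV (step 1).
    SELECT * FROM tableToCopy
    INTO OUTFILE '/var/lib/mysql-files/tableToCopy.csv'
    FIELDS TERMINATED BY ',' ENCLOSED BY '"'
    LINES TERMINATED BY '\n';

    -- On the new database: relax checks while bulk loading (step 2).
    SET unique_checks = 0;
    SET foreign_key_checks = 0;

    -- Bulk load the CSV (step 3); the trailing column list maps
    -- the CSV fields onto the target table's columns.
    LOAD DATA INFILE '/var/lib/mysql-files/tableToCopy.csv'
    INTO TABLE table1
    FIELDS TERMINATED BY ',' ENCLOSED BY '"'
    LINES TERMINATED BY '\n'
    (column1, column2);

    -- Restore the checks (step 4).
    SET unique_checks = 1;
    SET foreign_key_checks = 1;

Populating table1, table2, and table3 would repeat the LOAD DATA step once per target table, each with its own column mapping.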
Another possibility was suggested in a comment: use mysqldump, then load the resulting file via the mysql client program.
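For example (the database names are placeholders, and this assumes the target table has the same structure as the source):

    # Dump only the data of the source table from the old database...
    mysqldump --single-transaction --quick --no-create-info oldDataBase tableToCopy > tableToCopy.sql

    # ...then replay it into the new database with the command-line client.
    mysql newDataBase < tableToCopy.sql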
Avoid using a GUI-style MySQL client program for this; stick with the command-line programs. As good as those GUI clients are, they aren't engineered for streaming in multi-tens-of-megabyte .sql files.