I currently have the code below, but when adding around 2,000 rows it runs too slowly because everything happens inside a foreach loop.
foreach ($tables as $key => $table) {
    // Track the ids copied for this model so stale live rows can be removed afterwards
    $class_name = explode("\\", get_class($table[0]));
    $class_name = end($class_name);
    $moved = 'moved_' . $class_name;
    ${$moved} = [];

    foreach ($table[0]->where('website_id', $website->id)->get() as $value) {
        $value->website_id = $live_website_id;
        $value->setAppends([]);

        // One query per row against the live connection
        $table[0]::on('live')->updateOrCreate([
            'id' => $value->id,
            'website_id' => $value->website_id
        ], $value->toArray());

        ${$moved}[] = $value->id;
    }

    // Remove rows that were deleted on the development side
    if ($table[1]) {
        $table[0]::on('live')->where(['website_id' => $live_website_id])
            ->whereNotIn('id', ${$moved})
            ->delete();
    }
}
What is happening is basically this: users add/update/delete data on a development server, and when they hit a button that data needs to be pushed into the live table while retaining the IDs. Letting the live table auto-increment its own IDs won't work because of look-up tables and because multiple users can launch data live at the same time.
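One direction I can think of, assuming Laravel 8+ where the query builder exposes upsert(), is to read the dev rows in chunks and write each chunk with a single upsert instead of calling updateOrCreate per row. The 500-row chunk size and the use of getAttributes() (raw column values, so appended accessors and date re-serialization don't get in the way) are just illustrative:

foreach ($tables as $table) {
    $movedIds = [];

    $table[0]->where('website_id', $website->id)
        ->chunk(500, function ($rows) use (&$movedIds, $live_website_id, $table) {
            $batch = [];
            foreach ($rows as $row) {
                $row->website_id = $live_website_id;
                $batch[] = $row->getAttributes(); // raw column values, id included
                $movedIds[] = $row->id;
            }
            // One bulk upsert per chunk instead of one query per row
            $table[0]::on('live')->upsert($batch, ['id', 'website_id']);
        });

    // Remove rows that no longer exist on the development side
    if ($table[1]) {
        $table[0]::on('live')
            ->where('website_id', $live_website_id)
            ->whereNotIn('id', $movedIds)
            ->delete();
    }
}

This keeps the development IDs and cuts the work down to one upsert per chunk plus one delete per table.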
What is the best way to do this? Should I simply remove all the data in that table (there is a unique identifier for chunks of data) and then just insert?
I think I can do that with a single delete plus chunked inserts.
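A minimal sketch of that delete-then-insert idea, assuming the "unique identifier for chunks of data" is the website_id column and that the whole copy can run inside a transaction on the live connection (the 500-row chunk size is arbitrary):

use Illuminate\Support\Facades\DB;

DB::connection('live')->transaction(function () use ($tables, $website, $live_website_id) {
    foreach ($tables as $table) {
        // Wipe everything for this website on live first
        $table[0]::on('live')->where('website_id', $live_website_id)->delete();

        // Re-insert the development rows in batches, keeping the original ids
        $table[0]->where('website_id', $website->id)
            ->chunk(500, function ($rows) use ($table, $live_website_id) {
                $batch = $rows->map(function ($row) use ($live_website_id) {
                    $row->website_id = $live_website_id;
                    return $row->getAttributes(); // raw column values, id included
                })->all();

                $table[0]::on('live')->insert($batch);
            });
    }
});

Since the delete and inserts only touch rows for this website_id, other users pushing their own sites at the same time shouldn't be affected, and the transaction means readers on the live connection shouldn't see a half-copied state.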