I'm processing roughly 25,000 records, but the script somehow exceeds the maximum execution time. I'm using CodeIgniter 3.0.
The records are text data extracted from a PDF by a library I wrote. The script stays within the limit if I only display the data, but once the processing gets more involved, such as writing the records to the database (MySQL), it exceeds the 300-second execution time (I had already raised the limit to that value).
To illustrate:
function process() {
    $data = processThePDF(); // returns the 25,000 parsed records
    if ($data) {
        foreach ($data as $dt) {
            $info = $this->just_another_model->view($dt['id']); // fetch the old record
            if ($info) {
                // Update the existing record
                $this->just_another_model->update([$params]);
                // Log the update
                $this->just_another_model->log([$params]);
            } else {
                // Register (insert) the new record
                $this->just_another_model->insert([$params]);
                // Log the registration
                $this->just_another_model->log([$params]);
            }
        }
    }
}
So my questions are: 1. Is there a better way to optimize this? 2. Is it advisable to write the data to a JSON or text file before processing it?
Store your rows in arrays and update/insert them in batches:
function process() {
    $data = processThePDF(); // returns the 25,000 parsed records
    $update_data = array();
    $insert_data = array();
    if ($data) {
        foreach ($data as $dt) {
            $info = $this->just_another_model->view($dt['id']); // fetch the old record
            if ($info) {
                $update_data[] = array($params);
            } else {
                $insert_data[] = array($params);
            }
        }
        if (count($update_data) > 0) {
            $this->just_another_model->update_batch($update_data);
        }
        if (count($insert_data) > 0) {
            $this->just_another_model->insert_batch($insert_data);
        }
    }
}
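With 25,000 rows, you may also want to flush the accumulated arrays in fixed-size chunks rather than holding everything in memory until the end (CodeIgniter 3's `insert_batch()`/`update_batch()` already split their SQL into batches of 100 rows internally, but chunking on your side bounds the PHP array size too). A minimal sketch of the idea in plain PHP, where `$process_chunk` is a hypothetical stand-in for a model call such as `update_batch()`:

```php
<?php
// Flush accumulated rows in fixed-size chunks so memory stays bounded.
// $process_chunk stands in for a model call such as update_batch().
function flush_in_chunks(array $rows, int $chunk_size, callable $process_chunk): int
{
    $flushed = 0;
    foreach (array_chunk($rows, $chunk_size) as $chunk) {
        $process_chunk($chunk);       // e.g. $this->just_another_model->update_batch($chunk);
        $flushed += count($chunk);    // track how many rows have been written
    }
    return $flushed;
}
```

Calling `flush_in_chunks($update_data, 100, ...)` every few thousand records keeps both the PHP process and the query size small.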
Furthermore, you can fetch all the old records in a single batch query before the loop and re-index the result by id, so each lookup becomes $old_records[$dt['id']] instead of one query per record.
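For example, assuming the old rows come back as an array of associative arrays (as `$query->result_array()` returns in CodeIgniter), PHP's built-in `array_column()` can re-index them by id in one call:

```php
<?php
// Rows as they might come back from a single batch query
// (e.g. $query->result_array() in CodeIgniter).
$rows = [
    ['id' => 10, 'name' => 'alpha'],
    ['id' => 42, 'name' => 'beta'],
];

// Passing null as the column keeps each full row as the value,
// while 'id' becomes the array key.
$old_records = array_column($rows, null, 'id');

// Lookups inside the loop are now O(1) array accesses
// instead of one database query per record.
$info = $old_records[42] ?? null;
```

This turns 25,000 `view()` queries into one query plus cheap array lookups, which is usually where most of the execution time goes.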