Laravel upload a big CSV efficiently
I am trying to upload a CSV file to my database in Laravel, but my CSV file is pretty big: I have almost 500 million rows that I want to import. (I am using Maatwebsite to do this.)
When I try to import it, I get:
Maximum execution time of 300 seconds exceeded
As you can see, I already changed the "max_input_time" in the php.ini file. 300 seconds should be enough, because DataGrip takes only 3 minutes. And even if it takes longer in Laravel, there has to be another way than increasing the "max_input_time".
This is the code that converts the data into a model and eventually puts it into the database:
// Maps one CSV row to a DutchPostalcode model
public function model(array $row)
{
    return new DutchPostalcode([
        'postalcode' => $row['PostcodeID'],
        'street'     => $row['Straat'],
        'place'      => $row['Plaats'],
        'government' => $row['Gemeente'],
        'province'   => $row['Provincie'],
        'latitude'   => $row['Latitude'],
        'longtitude' => $row['Longitude'],
    ]);
}
And this is my controller:
public function writeDutchPostalCodes()
{
    Excel::import(new DutchPostalcodes, 'C:\Users\Moeme\Documents\Projects\ahmo apps\Apps\freshness\Freshness - be\FreshnessBE\resources\postalcodes\postcodetabel_1.csv');
}
Use Laravel queues: https://laravel.com/docs/9.x/queues
For a large process like this you should do the work in the background.
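With Maatwebsite Excel you can combine queueing with chunk reading, so the import is split into many small queued jobs instead of one long-running request. A minimal sketch, assuming the model lives in App\Models and that your heading-row configuration already yields the keys used in the question; the chunk and batch sizes are illustrative, not taken from your code:

<?php

namespace App\Imports;

use App\Models\DutchPostalcode;
use Illuminate\Contracts\Queue\ShouldQueue;
use Maatwebsite\Excel\Concerns\ToModel;
use Maatwebsite\Excel\Concerns\WithHeadingRow;
use Maatwebsite\Excel\Concerns\WithChunkReading;
use Maatwebsite\Excel\Concerns\WithBatchInserts;

class DutchPostalcodes implements ToModel, WithHeadingRow, WithChunkReading, WithBatchInserts, ShouldQueue
{
    public function model(array $row)
    {
        // Same mapping as in the question.
        return new DutchPostalcode([
            'postalcode' => $row['PostcodeID'],
            'street'     => $row['Straat'],
            'place'      => $row['Plaats'],
            'government' => $row['Gemeente'],
            'province'   => $row['Provincie'],
            'latitude'   => $row['Latitude'],
            'longtitude' => $row['Longitude'],
        ]);
    }

    // Each chunk of rows becomes its own queued job (size is an assumption).
    public function chunkSize(): int
    {
        return 1000;
    }

    // Rows are inserted in batches instead of one query per row.
    public function batchSize(): int
    {
        return 1000;
    }
}

With ShouldQueue the import is pushed onto the queue rather than run inside the HTTP request, so make sure a queue worker is running (php artisan queue:work) and the 300-second request limit no longer applies to the import itself.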
Increase the max_execution_time in php.ini, or split the file for processing, similar to array_chunk.
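If you would rather not go through Maatwebsite for a file this size, a rough sketch of the "split it up" idea is to stream the CSV with fgetcsv and insert the rows in fixed-size batches. The table name, helper name and chunk size below are assumptions for illustration, not taken from the question:

<?php

use Illuminate\Support\Facades\DB;

// Hypothetical helper: streams the CSV and inserts rows in batches.
function importPostalCodes(string $path, int $chunkSize = 1000): void
{
    $handle = fopen($path, 'r');
    $header = fgetcsv($handle); // first line holds the column names
    $buffer = [];

    while (($row = fgetcsv($handle)) !== false) {
        if ($row === [null]) {
            continue; // skip blank lines
        }

        $data = array_combine($header, $row);

        $buffer[] = [
            'postalcode' => $data['PostcodeID'],
            'street'     => $data['Straat'],
            'place'      => $data['Plaats'],
            'government' => $data['Gemeente'],
            'province'   => $data['Provincie'],
            'latitude'   => $data['Latitude'],
            'longtitude' => $data['Longitude'],
        ];

        // Flush the buffer as one multi-row INSERT once it reaches the chunk size.
        if (count($buffer) >= $chunkSize) {
            DB::table('dutch_postalcodes')->insert($buffer); // table name assumed
            $buffer = [];
        }
    }

    // Insert whatever is left over.
    if ($buffer !== []) {
        DB::table('dutch_postalcodes')->insert($buffer);
    }

    fclose($handle);
}

Because only one chunk is held in memory at a time and each chunk goes to the database as a single insert, this avoids both the memory cost of loading the whole file and the per-row query overhead; run it from a queued job or an Artisan command so it is not bound by the web request's execution time.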