
Laravel: upload a big CSV efficiently

I am trying to import a CSV file into my database in Laravel, but my CSV file is pretty big: I have almost 500 million rows that I want to import. (I am using Maatwebsite to do this.)

When I try to import it, I get:

Maximum execution time of 300 seconds exceeded

As you can see, I already changed "max_input_time" in the php.ini file. 300 seconds should be enough, since DataGrip takes only 3 minutes. And even if it takes longer in Laravel, there has to be another way than increasing "max_input_time".
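For reference, the limit named in the error is max_execution_time, not max_input_time (which only governs how long PHP spends parsing the incoming request data). A minimal sketch of raising the relevant limits at runtime, assuming the import runs in a normal web request; the values are assumptions, not recommendations:

// The exceeded limit is max_execution_time; max_input_time only covers
// request parsing. Both values below are assumptions for testing.
set_time_limit(600);              // same effect as max_execution_time = 600
ini_set('memory_limit', '512M');  // big imports often hit the memory limit next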

This is the code that converts the data into a model and eventually puts it into the database:

public function model(array $row)
{
    return new DutchPostalcode([
        'postalcode' => $row['PostcodeID'],
        'street' => $row['Straat'],
        'place' => $row['Plaats'],
        'government' => $row['Gemeente'],
        'province' => $row['Provincie'],
        'latitude' => $row['Latitude'],
        'longtitude' => $row['Longitude'],
    ]);
}

This is my controller:

public function writeDutchPostalCodes()
{
    Excel::import(new DutchPostalcodes, 'C:\Users\Moeme\Documents\Projects\ahmo apps\Apps\freshness\Freshness - be\FreshnessBE\resources\postalcodes\postcodetabel_1.csv');
}

Use Laravel queues.

https://laravel.com/docs/9.x/queues

For large processes like this, you should do the work in the background; a minimal sketch follows.
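A sketch assuming Maatwebsite 3.x: making the import class implement ShouldQueue together with WithChunkReading turns every chunk of rows into its own queued job. The class and column names are taken from the question; the namespaces and the chunk size are assumptions:

<?php

namespace App\Imports;

use App\Models\DutchPostalcode;
use Illuminate\Contracts\Queue\ShouldQueue;
use Maatwebsite\Excel\Concerns\ToModel;
use Maatwebsite\Excel\Concerns\WithChunkReading;

class DutchPostalcodes implements ToModel, WithChunkReading, ShouldQueue
{
    public function model(array $row)
    {
        // Same mapping as in the question.
        return new DutchPostalcode([
            'postalcode' => $row['PostcodeID'],
            'street' => $row['Straat'],
            'place' => $row['Plaats'],
            'government' => $row['Gemeente'],
            'province' => $row['Provincie'],
            'latitude' => $row['Latitude'],
            'longtitude' => $row['Longitude'],
        ]);
    }

    // Each chunk of rows becomes its own queued job, so no single
    // request or job has to finish within the 300-second limit.
    public function chunkSize(): int
    {
        return 1000;
    }
}

With this in place, the Excel::import(...) call in the controller only dispatches the jobs and returns immediately; a worker started with php artisan queue:work processes the chunks in the background.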

Increase the max_execution_time in php.ini, or split the file and process it in chunks, similar to array_chunk.
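In that spirit, a minimal sketch without Maatwebsite: stream the file with fgetcsv() and flush multi-row INSERTs in fixed-size batches, so memory stays flat no matter how large the file is. The column order in the file and the dutch_postalcodes table name are assumptions:

<?php

use Illuminate\Support\Facades\DB;

$handle = fopen($path, 'r');   // $path: the CSV location, as in the question
fgetcsv($handle);              // skip the heading row

$batch = [];
while (($row = fgetcsv($handle)) !== false) {
    // Column order is an assumption; adjust the indexes to the real file.
    $batch[] = [
        'postalcode' => $row[0],
        'street'     => $row[1],
        'place'      => $row[2],
        'government' => $row[3],
        'province'   => $row[4],
        'latitude'   => $row[5],
        'longtitude' => $row[6],
    ];

    if (count($batch) === 1000) {
        DB::table('dutch_postalcodes')->insert($batch); // one multi-row INSERT
        $batch = [];
    }
}

if ($batch !== []) {
    DB::table('dutch_postalcodes')->insert($batch); // flush the remainder
}

fclose($handle);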
