I have to write code for a report that compiles a month's worth of data into a CSV file. The data is roughly 112,000 rows with 47 cells per row. I'm running out of memory as I try to write the file. I thought something with fflush might fix it, but it doesn't seem to. The code is in PHP.
$filename = str_replace(array('-', ':', '_', ' '), '', "export_" . date("m-d-Y H:i:s") . ".csv");
header('Content-Type: text/csv');
header("Content-Disposition: attachment; filename=\"$filename\"");
header('Pragma: no-cache');
header('Expires: 0');

$con = mysqli_connect("localhost", "root", "changedForSecurity", "reports");
// Check connection
if (mysqli_connect_errno()) {
    echo "Failed to connect to MySQL: " . mysqli_connect_error();
    exit;
}

$result = mysqli_query($con, "SELECT * FROM testing");
$fp = fopen($filename, 'w');
$i = 1;
while ($row = mysqli_fetch_array($result, MYSQLI_ASSOC)) {
    if ($i < 15000) {
        fputcsv($fp, $row);
        $i++;
    } else {
        // Flush the write buffer to disk every 15,000 rows
        fflush($fp);
        $i = 1;
        fputcsv($fp, $row);
    }
}
fclose($fp);
You will have to change your memory limit setting. You can do this either by editing it manually in the php.ini file, or at runtime in your code:

ini_set('memory_limit', '256M');

Or in the .htaccess file:

php_value memory_limit 256M
One thing to note is that you should try to limit the amount you set here. If yours is currently set to 64M, try 128M first; if it's set to 128M, try 256M. It's overkill to set it higher than necessary.
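If you're not sure what your limit currently is, you can check it first with PHP's standard ini_get before deciding what to raise it to:

```php
<?php
// Print the current memory limit as configured, e.g. "128M".
echo ini_get('memory_limit');
```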
You can check how much memory your code is using like this:

$initMemory = memory_get_usage();
// your code
$usedMemory = memory_get_usage() - $initMemory;
echo $usedMemory; // bytes consumed by your code
exit;
Apart from increasing your memory limit in the ini, you could also just split the export across several files. Not nearly as elegant, but it works in a pinch.
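As a sketch of that approach (assuming the same "reports" database and "testing" table as in the question; the 15,000-row chunk size and the "export_partN.csv" naming are illustrative choices), you could also fetch the result set unbuffered with MYSQLI_USE_RESULT and rotate the output file every chunk, so neither the driver nor PHP ever holds the full result in memory:

```php
<?php
// Sketch only: stream rows from MySQL and split the CSV into chunks.
$con = mysqli_connect("localhost", "root", "changedForSecurity", "reports");
if (mysqli_connect_errno()) {
    die("Failed to connect to MySQL: " . mysqli_connect_error());
}

// MYSQLI_USE_RESULT streams rows as they are fetched instead of
// buffering all ~112,000 of them in PHP's memory at once, which is
// usually the real cause of the memory exhaustion here.
$result = mysqli_query($con, "SELECT * FROM testing", MYSQLI_USE_RESULT);

$chunkSize = 15000;   // rows per output file (illustrative)
$rowCount  = 0;
$fileIndex = 0;
$fp        = null;

while ($row = mysqli_fetch_assoc($result)) {
    if ($rowCount % $chunkSize === 0) {
        // Close the previous chunk and start a new file.
        if ($fp !== null) {
            fclose($fp);
        }
        $fileIndex++;
        $fp = fopen("export_part{$fileIndex}.csv", 'w');
    }
    fputcsv($fp, $row);
    $rowCount++;
}

if ($fp !== null) {
    fclose($fp);
}
mysqli_free_result($result);
mysqli_close($con);
```

Note that with an unbuffered result set you must fetch every row (or free the result) before issuing another query on the same connection.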