
How do you limit PHP memory usage when processing MySQL query results?

So I have a PHP page that allows users to download a CSV of what could be a whole bunch of records. The problem is that the more results the MySQL query returns, the more memory it uses. That's not really surprising, but it does pose a problem.

I tried using mysql_unbuffered_query() but that didn't make any difference, so I need some other way to free the memory used by what I assume are the previously processed rows. Is there a standard way to do this?

Here's a commented log that illustrates what I'm talking about:

// Method first called
2009-10-07 17:44:33 -04:00 --- info: used 3555064 bytes of memory

// Right before the query is executed
2009-10-07 17:44:33 -04:00 --- info: used 3556224 bytes of memory

// Immediately after query execution
2009-10-07 17:44:34 -04:00 --- info: used 3557336 bytes of memory

// Now we're processing the result set
2009-10-07 17:44:34 -04:00 --- info: Downloaded 1000 rows and used 3695664 bytes of memory
2009-10-07 17:44:35 -04:00 --- info: Downloaded 2000 rows and used 3870696 bytes of memory
2009-10-07 17:44:36 -04:00 --- info: Downloaded 3000 rows and used 4055784 bytes of memory
2009-10-07 17:44:37 -04:00 --- info: Downloaded 4000 rows and used 4251232 bytes of memory
2009-10-07 17:44:38 -04:00 --- info: Downloaded 5000 rows and used 4436544 bytes of memory
2009-10-07 17:44:39 -04:00 --- info: Downloaded 6000 rows and used 4621776 bytes of memory
2009-10-07 17:44:39 -04:00 --- info: Downloaded 7000 rows and used 4817192 bytes of memory
2009-10-07 17:44:40 -04:00 --- info: Downloaded 8000 rows and used 5012568 bytes of memory
2009-10-07 17:44:41 -04:00 --- info: Downloaded 9000 rows and used 5197872 bytes of memory
2009-10-07 17:44:42 -04:00 --- info: Downloaded 10000 rows and used 5393344 bytes of memory
2009-10-07 17:44:43 -04:00 --- info: Downloaded 11000 rows and used 5588736 bytes of memory
2009-10-07 17:44:43 -04:00 --- info: Downloaded 12000 rows and used 5753560 bytes of memory
2009-10-07 17:44:44 -04:00 --- info: Downloaded 13000 rows and used 5918304 bytes of memory
2009-10-07 17:44:45 -04:00 --- info: Downloaded 14000 rows and used 6103488 bytes of memory
2009-10-07 17:44:46 -04:00 --- info: Downloaded 15000 rows and used 6268256 bytes of memory
2009-10-07 17:44:46 -04:00 --- info: Downloaded 16000 rows and used 6443152 bytes of memory
2009-10-07 17:44:47 -04:00 --- info: used 6597552 bytes of memory

// This is after unsetting the variable. Didn't make a difference because garbage
// collection had not run
2009-10-07 17:44:47 -04:00 --- info: used 6598152 bytes of memory

I am hoping there is some sort of standard technique for dealing with large result sets like this (or even much larger), but my research hasn't turned up anything.

Ideas?

Here's some code, by request:

    $results = mysql_query($query);

    // Baseline memory usage right after the query executes
    Kohana::log('info', "used " . memory_get_usage() . " bytes of memory");

    $first = TRUE;
    $row_count = 0;

    while ($row = mysql_fetch_assoc($results)) {
        $row_count++;
        $new_row = $row;

        // Strip the user_id column so it never appears in the CSV
        if (array_key_exists('user_id', $new_row)) {
            unset($new_row['user_id']);
        }

        // On the first row, emit the CSV header built from the column names
        if ($first) {
            $columns = array_keys($new_row);
            $columns = array_map(array('columns', "title"), $columns);
            echo implode(",", array_map(array('Reports_Controller', "_quotify"), $columns));
            echo "\n";
            $first = FALSE;
        }

        // Log memory usage every 1000 rows
        if (($row_count % 1000) == 0) {
            Kohana::log('info', "Downloaded $row_count rows and used " . memory_get_usage() . " bytes of memory");
        }

        // Quote each value and emit the data row
        echo implode(",", array_map(array('Reports_Controller', "_quotify"), $new_row));
        echo "\n";
    }

Some further profiling reveals that the issue is a memory leak somewhere. I stripped the code down to its simplest form and memory usage does not grow with each iteration. I suspect it's Kohana (the framework I'm using).
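For reference, the stripped-down test looked something like this (a reconstruction rather than the exact code; it runs the same query with no framework calls and no CSV output):

    $results = mysql_query($query);
    $row_count = 0;

    while ($row = mysql_fetch_assoc($results)) {
        $row_count++;

        // With no logging framework or output work inside the loop, memory
        // stays flat, which points at the framework rather than the fetch loop
        if (($row_count % 1000) == 0) {
            error_log("$row_count rows, " . memory_get_usage() . " bytes");
        }

        unset($row);
    }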

Is this a "live" download? By that I mean, are you pushing this to the client as you're generating the CSV? If so, then there are some things you can do:

  1. Don't use output buffering. This saves everything in memory until you flush it explicitly or implicitly (when the script ends), which will use more memory;
  2. As you read rows from the database, write them to the client.

Other than that, we probably need to see some skeletal code.
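In the meantime, here is a minimal sketch of what those two points look like in practice (the filename is a placeholder, $query is assumed from the question, and the old mysql_* API is used to match the asker's code):

    header('Content-Type: text/csv');
    header('Content-Disposition: attachment; filename="report.csv"');

    while (ob_get_level() > 0) {
        ob_end_flush();                          // point 1: turn off output buffering
    }

    $out = fopen('php://output', 'w');           // write straight to the client
    $results = mysql_unbuffered_query($query);   // don't buffer the result set in PHP

    $first = TRUE;
    while ($row = mysql_fetch_assoc($results)) {
        if ($first) {
            fputcsv($out, array_keys($row));     // header row from the column names
            $first = FALSE;
        }
        fputcsv($out, $row);                     // point 2: write each row as it's read
        flush();                                 // push the bytes out to the client
    }
    fclose($out);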

Are you actually flushing the data periodically? PHP's normal buffering can be pretty vicious for long-running code, since there are multiple copies of the data between the MySQL client, your variables, and the output system. It's been a few years, but I last recall using something like this in skeleton code:

    ob_end_flush();                              // stop buffering output in PHP
    $results = mysql_unbuffered_query($query);   // stream rows instead of buffering them

    while ($row = mysql_fetch_assoc($results)) {
        // ... do something with $row ...

        flush();        // push the output through to Apache
        unset($row);    // ... and unset all other temporary variables ...
    }
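The combination is what matters here: mysql_unbuffered_query() keeps PHP from holding the entire result set, flush() keeps the generated output from accumulating in a buffer, and unset() lets each row's memory be reclaimed before the next one is fetched.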

Thanks for your question. Using mysql_unbuffered_query() solved my problem of running out of RAM with PHP and MySQL when working with a large dataset. I was getting:

PHP Fatal error: Allowed memory size of 134217728 bytes exhausted (tried to allocate 32 bytes) in /content/apps/application_price.php on line 25
