
PHP, mysql memory leak

I've run into a problem: I can't find the memory leak when working with a database. The script pulls a large amount of data from the database, so the leak is critical. The problem occurs with mysqli, mysql, and PDO alike. Here is the test code:

$link = mysqli_connect('localhost', 'root', '');
if (!$link) {
    die('Connection error: ' . mysqli_connect_error());
}
mysqli_select_db($link, 'coolstat.my') or die('Can\'t use coolstat.my: ' . mysqli_error($link));


for($ii=0; $ii<20000; $ii+=1000){
    $sql= "SELECT `codes_data`.* FROM `codes_data` INNER JOIN codes ON codes.siteid= 20     AND codes.codeid=codes_data.codeid LIMIT ".$ii.", ".($ii+1000)."";
    ///

    $data= array();
    $result = mysqli_query($link, $sql);
    while (($row = mysqli_fetch_array($result))){
        $data[]= $row;
    }
    mysqli_free_result($result);
    unset($result);
    unset($data);
    echo "Memory get_data usage: ".convert_memory_val(memory_get_peak_usage(true))."<br />\n";
}
mysqli_close($link);


function convert_memory_val($size){
    $unit = array('b', 'kb', 'mb', 'gb', 'tb', 'pb');
    return @round($size / pow(1024, ($i = floor(log($size, 1024)))), 2) . ' ' . $unit[$i];
}

It outputs:

Memory get_data usage: 3.25 mb
Memory get_data usage: 6 mb
Memory get_data usage: 9 mb
Memory get_data usage: 11.75 mb
Memory get_data usage: 14.75 mb
Memory get_data usage: 17.75 mb
Memory get_data usage: 20.5 mb
Memory get_data usage: 23.5 mb
Memory get_data usage: 26.5 mb
Memory get_data usage: 29.5 mb
Memory get_data usage: 32.25 mb
Memory get_data usage: 35.25 mb
Memory get_data usage: 38.25 mb
Memory get_data usage: 41.25 mb
Memory get_data usage: 44 mb
Memory get_data usage: 47 mb
Memory get_data usage: 50 mb
Memory get_data usage: 53 mb
Memory get_data usage: 56 mb
Memory get_data usage: 58.75 mb

Your mistake is in the LIMIT clause: the second number should be a constant, e.g. 1000. With what you have, the queries will be:

LIMIT 0, 1000
LIMIT 1000, 2000
LIMIT 2000, 3000
...

This is not pagination: you fetch data in chunks that grow in size and also overlap, so the increase in memory use is expected.
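To see why the chunks grow and overlap, note that MySQL's `LIMIT offset, count` takes an offset and a row count, not an end position. A small sketch of the arithmetic (the `rowsFetched` helper is ours, not part of the original code):

```php
<?php
// Hypothetical helper: the half-open row range [offset, offset + count)
// that "LIMIT offset, count" asks MySQL for.
function rowsFetched(int $offset, int $count): array {
    return [$offset, $offset + $count];
}

// Buggy form from the question: the count grows along with the offset.
foreach ([0, 1000, 2000] as $ii) {
    [$from, $to] = rowsFetched($ii, $ii + 1000);
    printf("LIMIT %d, %d fetches rows %d..%d\n", $ii, $ii + 1000, $from, $to - 1);
}
// The chunks are 1000, 2000, 3000 rows long and overlap each other.

// Fixed form: a constant count gives disjoint 1000-row chunks.
foreach ([0, 1000, 2000] as $ii) {
    [$from, $to] = rowsFetched($ii, 1000);
    printf("LIMIT %d, 1000 fetches rows %d..%d\n", $ii, $from, $to - 1);
}
```

So the fix for the loop above is to replace `LIMIT ".$ii.", ".($ii+1000)."` with a constant second argument, `LIMIT ".$ii.", 1000`.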

When the garbage collector is turned on, the cycle-finding algorithm as described above is executed whenever the root buffer runs full. The root buffer has a fixed size of 10,000 possible roots [...]

It is also possible to force the collection of cycles even if the possible root buffer is not full yet. For this, you can use the gc_collect_cycles() function. This function will return how many cycles were collected by the algorithm.

So, just try to force garbage collection at the end of your loop body:

for ($ii = 0; $ii < 20000; $ii += 1000) {
    // ...

    mysqli_free_result($result);
    unset($result);
    unset($data);
    echo "Memory before GC run: ".convert_memory_val(memory_get_peak_usage(true))."<br />\n";

    $n = gc_collect_cycles();
    echo "GC collected $n garbage cycles<br />\n";
    echo "Memory after GC run: ".convert_memory_val(memory_get_peak_usage(true))."<br />\n";
}
