
PHP, mysql memory leak

I'm facing a problem: I can't find the source of a memory leak when working with a database. The script fetches a large amount of data from the database, so the leak is critical. The problem occurs with mysqli, mysql, and PDO alike. Here is the test code:

$link = mysqli_connect('localhost', 'root', '');
if (!$link) {
    die('Connection error: ' . mysqli_connect_error());
}
mysqli_select_db($link, 'coolstat.my') or die('Can\'t use coolstat.my: ' . mysqli_error($link));


for($ii=0; $ii<20000; $ii+=1000){
    $sql = "SELECT `codes_data`.* FROM `codes_data` INNER JOIN codes ON codes.siteid = 20 AND codes.codeid = codes_data.codeid LIMIT " . $ii . ", " . ($ii + 1000);

    $data= array();
    $result = mysqli_query($link, $sql);
    while (($row = mysqli_fetch_array($result))){
        $data[]= $row;
    }
    mysqli_free_result($result);
    unset($result);
    unset($data);
    echo "Memory get_data usage: ".convert_memory_val(memory_get_peak_usage(true))."<br />\n";
}
mysqli_close($link);


function convert_memory_val($size){
    $unit = array('b', 'kb', 'mb', 'gb', 'tb', 'pb');
    $i = $size > 0 ? floor(log($size, 1024)) : 0; // guard against log(0)
    return round($size / pow(1024, $i), 2) . ' ' . $unit[$i];
}

It outputs:

Memory get_data usage: 3.25 mb
Memory get_data usage: 6 mb
Memory get_data usage: 9 mb
Memory get_data usage: 11.75 mb
Memory get_data usage: 14.75 mb
Memory get_data usage: 17.75 mb
Memory get_data usage: 20.5 mb
Memory get_data usage: 23.5 mb
Memory get_data usage: 26.5 mb
Memory get_data usage: 29.5 mb
Memory get_data usage: 32.25 mb
Memory get_data usage: 35.25 mb
Memory get_data usage: 38.25 mb
Memory get_data usage: 41.25 mb
Memory get_data usage: 44 mb
Memory get_data usage: 47 mb
Memory get_data usage: 50 mb
Memory get_data usage: 53 mb
Memory get_data usage: 56 mb
Memory get_data usage: 58.75 mb

Your mistake is in the LIMIT clause: the second number is a row count, not an end position, so it should be a constant, e.g. 1000. With what you have, the queries will be

LIMIT 0, 1000
LIMIT 1000, 2000
LIMIT 2000, 3000
...

This is not pagination: each query fetches a larger chunk than the last, and the chunks overlap. So the increase in memory use is expected, not a leak. (Note also that memory_get_peak_usage() reports the high-water mark, which never decreases over the lifetime of the script.)
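A minimal, self-contained sketch (no database needed) of the difference: it prints the LIMIT clauses the original loop generates next to those of a fixed-page-size loop, using the same loop bounds as the question's code.

```php
<?php
// Compare the LIMIT clauses built by the original loop (growing, overlapping
// chunks) with those of a correct pagination loop (constant page size).
$buggy = [];
$fixed = [];
for ($ii = 0; $ii < 20000; $ii += 1000) {
    $buggy[] = "LIMIT " . $ii . ", " . ($ii + 1000); // row count grows every pass
    $fixed[] = "LIMIT " . $ii . ", 1000";            // always 1000 rows per page
}
echo $buggy[1], "\n"; // LIMIT 1000, 2000 -- rows 1000..2999, overlaps the next chunk
echo $fixed[1], "\n"; // LIMIT 1000, 1000 -- rows 1000..1999, no overlap
```

With the constant page size, every chunk holds at most 1000 rows, so the memory needed per iteration stays flat instead of growing with `$ii`.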

From the PHP manual's "Collecting Cycles" page:

When the garbage collector is turned on, the cycle-finding algorithm as described above is executed whenever the root buffer runs full. The root buffer has a fixed size of 10,000 possible roots [...]

It is also possible to force the collection of cycles even if the possible root buffer is not full yet. For this, you can use the gc_collect_cycles() function. This function will return how many cycles were collected by the algorithm.

So just try forcing garbage collection at the end of your loop body:

for ($ii = 0; $ii < 20000; $ii += 1000) {
    // ...

    mysqli_free_result($result);
    unset($result);
    unset($data);
    // use memory_get_usage(), not memory_get_peak_usage(): the peak value
    // never decreases, so it cannot show the effect of a GC run
    echo "Memory before GC run: ".convert_memory_val(memory_get_usage(true))."<br />\n";

    $n = gc_collect_cycles();
    echo "GC collected $n garbage cycles<br />\n";
    echo "Memory after GC run: ".convert_memory_val(memory_get_usage(true))."<br />\n";
}
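For illustration, here is a minimal standalone example (no database; assumes CLI PHP with the garbage collector enabled, which is the default) showing that gc_collect_cycles() reclaims reference cycles that unset() alone cannot free:

```php
<?php
// Two objects referencing each other form a cycle: after unset() their
// refcounts are still > 0, so only the cycle collector can reclaim them.
gc_enable();
gc_collect_cycles(); // start from a clean slate

$a = new stdClass();
$b = new stdClass();
$a->other = $b;
$b->other = $a;
unset($a, $b); // the memory is NOT released here

$n = gc_collect_cycles();
echo "GC collected $n garbage cycles\n"; // $n > 0: the cycle was found and freed
```

In the question's loop, though, `$data` and `$result` form no such cycles, which is why the real fix is the LIMIT clause rather than the collector.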
