
PHP: Allowed memory size exhausted

I am trying to display users on a map using the Google Maps API. Now that the user count has grown to 12,000, I get a memory exhaustion error. For the time being I increased the memory limit from 128M to 256M, but I am sure the same error will come back once there are 25,000 users.

The error is:

Fatal error: Allowed memory size of 134217728 bytes exhausted (tried to allocate 30720 bytes) in /var/www/html/digitalebox_v2_20150904_100/protected/extensions/bootstrap/widgets/TbBaseMenu.php on line 193
Fatal error: Class declarations may not be nested in /var/www/html/yii-1.1.14.f0fee9/framework/collections/CListIterator.php on line 20

My code,

$criteria = new CDbCriteria();
$criteria->condition = 't.date BETWEEN :from_date AND :to_date';

$modelUsers = User::model()->findAll($criteria);
foreach ($modelUsers as $modelUser) {
   $fullAddress = $modelUser->address1 . ', ' . $modelUser->zip . ' ' . $modelUser->city . ', ' . Country::model()->findByPk($modelUser->countryCode)->countryName;
}

When $modelUsers contains 12,000 records, the memory problem appears because 12,000 user objects are held in memory at once.

What should I do to prevent this kind of issue in the future? What is the minimum memory size required for a PHP application to run?

When you call findAll() it loads all records at once, so you end up with a memory error. Specifically for such situations Yii has CDataProviderIterator. It allows iteration over large data sets without holding the entire set in memory.

$dataProvider = new CActiveDataProvider("User");
$iterator = new CDataProviderIterator($dataProvider);
foreach ($iterator as $user) {
   $fullAddress = $user->address1 . ', ' . $user->zip . ' ' . $user->city . ', ' . Country::model()->findByPk($user->countryCode)->countryName;
}
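If you still need the original date-range filter, a criteria can be passed to the data provider. A minimal sketch, assuming $fromDate and $toDate hold the bound values (those variable names are chosen here for illustration):

// Reuse the question's date-range condition with the data provider.
$criteria = new CDbCriteria();
$criteria->condition = 't.date BETWEEN :from_date AND :to_date';
$criteria->params = array(':from_date' => $fromDate, ':to_date' => $toDate); // assumed variables
$dataProvider = new CActiveDataProvider('User', array('criteria' => $criteria));
$iterator = new CDataProviderIterator($dataProvider, 100); // fetch 100 records per batch instead of all 12,000 at once
foreach ($iterator as $user) {
   // build $fullAddress exactly as above
}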

I'd solve this problem in a completely different way than suggested. I'd ditch the whole model concept and query MySQL for the addresses directly. You can have MySQL return the already-concatenated address, which means you avoid concatenating it in PHP - and avoid wasting memory on it.

The next step would be using PDO and issuing an unbuffered query. This means the PHP process will not hold all 12,000 records in its memory at once - that's how you avoid exhausting the limit.

The final step is outputting the result: as you loop through the unbuffered query, you can output a single row (or a batch of 10 rows) at a time.

What happens this way is that you trade CPU for RAM. Since you don't know how much RAM you will need, the best approach is to use as little as possible.

Using unbuffered queries and flushing PHP's output buffer seems like the only viable way to go for your use case, seeing that you can't avoid outputting a lot of data.
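A minimal sketch of that combination. The connection credentials, the user/country table and column names, and the $fromDate/$toDate variables below are all placeholders, not taken from the question:

// Connect and switch to unbuffered mode so rows are streamed, not loaded all at once.
$pdo = new PDO('mysql:host=localhost;dbname=app;charset=utf8', 'dbuser', 'dbpass');
$pdo->setAttribute(PDO::ATTR_ERRMODE, PDO::ERRMODE_EXCEPTION);
$pdo->setAttribute(PDO::MYSQL_ATTR_USE_BUFFERED_QUERY, false);

// Let MySQL do the concatenation instead of PHP.
$stmt = $pdo->prepare(
    "SELECT CONCAT(u.address1, ', ', u.zip, ' ', u.city, ', ', c.country_name) AS full_address
     FROM user u
     JOIN country c ON c.code = u.country_code
     WHERE u.date BETWEEN :from_date AND :to_date"
);
$stmt->execute(array(':from_date' => $fromDate, ':to_date' => $toDate));

// Output row by row and flush, so neither the result set nor the output accumulates in memory.
while ($row = $stmt->fetch(PDO::FETCH_ASSOC)) {
    echo $row['full_address'], "\n";
    flush();
}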

Perhaps increase the memory a bit more and see if that corrects the issue; try setting it to 512M or 1024M. I can tell you from my own experience, though, that trying to load that many Google Maps markers will probably crash the map itself.

Increase the value of memory_limit in php.ini, then restart Apache or whatever PHP is running under. Keep in mind that whatever you add here needs to be taken away from other programs running on the same machine (such as MySQL and its innodb_buffer_pool_size).
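For example, in php.ini (512M is only an illustration; pick a value that fits the machine):

; php.ini
memory_limit = 512M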
