
Fatal Error - Too many open files

I'm trying to run PHPUnit tests on my new machine and I get this error:

PHP Fatal error: Uncaught exception 'UnexpectedValueException' with message 'RecursiveDirectoryIterator::__construct(/usr/lib/php/pear/File/Iterator): failed to open dir: Too many open files' in /usr/lib/php/pear/File/Iterator/Factory.php:114

The same code runs fine on the old machine...

New machine environment: PHP 5.3.21 (cli). Old machine: PHP 5.3.14.

This is the PHPUnit output every time:

................EEEEEEEEEEEEEEEEEEEEEEEEEEEEEEEEEEEEEEEEEEEEEEEEE 65 / 66 ( 98%)
E

Time: 34 seconds, Memory: 438.50Mb

There were 50 errors:

1) XXXXXXXXXXX
PHP Fatal error:  Uncaught exception 'UnexpectedValueException' with message 'RecursiveDirectoryIterator::__construct(/usr/lib/php/pear/File/Iterator): failed to open dir: Too many open files' in /usr/lib/php/pear/File/Iterator/Factory.php:114

This can be a limitation on the server where the code is running. Every operating system only allows a certain number of open files/handles/sockets, and this limit is usually reduced further when the server is virtualized. On a Linux server you can check the current limit with ulimit -n, and if you have root access you can increase it with the same command. I assume there is an equivalent for Windows servers. Otherwise there is not much you can do about it, except ask your hosting provider or administrator to increase it.
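If you want to confirm the limit from inside PHP itself (for example in a test bootstrap), here is a minimal sketch; it assumes the posix extension is loaded and a Linux /proc filesystem:

<?php
// Sketch: inspect the current open-files limits from within PHP.
// Assumes the posix extension is available (common on Linux CLI builds).
$limits = posix_getrlimit();
echo 'Soft open-files limit: ' . $limits['soft openfiles'] . PHP_EOL;
echo 'Hard open-files limit: ' . $limits['hard openfiles'] . PHP_EOL;

// Rough count of descriptors this process currently holds (Linux only).
if (is_dir('/proc/self/fd')) {
    echo 'Currently open: ' . (count(scandir('/proc/self/fd')) - 2) . PHP_EOL;
}
?>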

More configurable limits:

Change the limits in /etc/security/limits.conf:

* soft nofile 1024
* hard nofile 65535

Increase the ulimit with ulimit -n 65535, or with echo 65535 > /proc/sys/fs/file-max, or in /etc/sysctl.conf:

fs.file-max=65535

How you can raise the open file limit (Linux or Mac OS):

ulimit -n 10000

This solves the problem with phpunit and/or phpdbg and the warning: Uncaught ErrorException: require([..file]): failed to open stream: Too many open files in [...]

In PHP, try this before executing:

exec('ulimit -S -n 2048');
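An alternative sketch, assuming PHP >= 7.0 with the posix extension, is to raise the soft limit of the current process directly instead of shelling out (the soft limit can only go up to the hard limit set by the administrator):

<?php
// Sketch: raise the soft open-files limit of this PHP process.
// Assumes PHP >= 7.0 with the posix extension.
$limits = posix_getrlimit();
$hard   = $limits['hard openfiles'];
if ($hard === 'unlimited') {
    $hard = POSIX_RLIMIT_INFINITY;
}

if (!posix_setrlimit(POSIX_RLIMIT_NOFILE, 2048, $hard)) {
    echo 'Could not raise the open-files limit' . PHP_EOL;
}
?>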

Don't store DirectoryIterator objects for later use; you will get a "too many open files" error when you store more than the operating system limit (usually 256 or 1024).

For example, this will yield an error if the directory has too many files:

<?php
// Storing the iterator objects keeps directory handles open,
// which can exceed the operating system limit on large directories.
$files = array();
foreach (new DirectoryIterator('myDir') as $file) {
    $files[] = $file;
}
?>

Presumably, this approach is memory intensive as well.

source: http://php.net/manual/pt_BR/directoryiterator.construct.php#87425
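If you only need the paths, a small sketch of the workaround: store plain pathname strings instead of the iterator objects, so no directory handles are kept around after the loop:

<?php
// Sketch: collect pathname strings instead of DirectoryIterator objects.
$files = array();
foreach (new DirectoryIterator('myDir') as $file) {
    if ($file->isDot()) {
        continue;           // skip '.' and '..'
    }
    $files[] = $file->getPathname();
}
?>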

After 'waking' my computer from sleep mode I ran into this problem.

Restarting php-fpm as shown below fixed it. The classic "turn it off and back on again" solution.

sudo /etc/init.d/php-fpm restart

I think this may be related to Xdebug, which I recently added to PHP.

I've noticed this occur in PHP when you forget to wrap something in a closure. Carefully look at your recent diffs and you might be able to get to the bottom of this (in my case, I referenced $faker in a Laravel PHPUnit factory without a closure).

I experienced this error with an HTTP pool where I added too many URLs (around 2000).

I had to chunk the URLs into smaller batches, and the error stopped.

I think it's just how the Guzzle pool works: it doesn't close the cURL connections until the entire pool is done.

Example:

$responses = Http::pool(function (Pool $pool) use ($chunk) {
    return collect($chunk)->map(fn($url) => $pool->get($url));
});

Becomes:

collect($urls)
    ->chunk(25)
    ->each(function ($chunk) {
        $responses = Http::pool(function (Pool $pool) use ($chunk) {
            return collect($chunk)->map(fn($url) => $pool->get($url));
        });
    });

Http here is Laravel's wrapper around the Guzzle HTTP client: https://laravel.com/docs/9.x/http-client

On a Debian server you can also edit

/etc/php/php7.xx/fpm/pool.d/www.conf

and set

rlimit_files = 10000

then restart PHP-FPM:

/etc/init.d/php7.xx restart

Maybe there is some issue with /etc/init.d/phpx.x-fpm. Try restarting it:

sudo /etc/init.d/php7.2-fpm restart

I got this error every time about a Redis library PHP was trying to load, but it was caused by something I didn't expect at first. I kept getting the error after my program had been running for a while, doing a repetitive task. It turned out I opened a cURL session ($ch = curl_init(...)) that was supposed to be closed in the destructor of a class, but that destructor was never called. I fixed that and the too-many-open-files error disappeared.
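If you manage cURL handles in a long-running loop yourself, one way to avoid relying on __destruct() is to close the handle explicitly; here is a minimal sketch with a hypothetical ApiClient class (not the code from the answer above):

<?php
// Sketch (hypothetical class): close the cURL handle explicitly when done,
// instead of relying on __destruct(), which never runs while the object
// is still referenced somewhere.
class ApiClient
{
    private $ch;

    public function __construct($baseUrl)
    {
        $this->ch = curl_init($baseUrl);
        curl_setopt($this->ch, CURLOPT_RETURNTRANSFER, true);
    }

    public function fetch()
    {
        return curl_exec($this->ch);
    }

    public function close()
    {
        if ($this->ch !== null) {
            curl_close($this->ch);   // releases the underlying descriptor
            $this->ch = null;
        }
    }
}

// Usage: close in a finally block so the descriptor is always released.
$client = new ApiClient('https://example.com');
try {
    $client->fetch();
} finally {
    $client->close();
}
?>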
