
PHP failed to open stream: Too many open files

I have a problem with the error: PHP failed to open stream: Too many open files.

I have looked at various answers here on Stack Overflow, but I am unable to solve this issue. I have mainly tried to increase the limit on the maximum number of open files:

I have edited /etc/security/limits.conf, where I specified this:

*       soft    nofile      10000
*       hard    nofile      30000

After saving and logging out / rebooting the box, the command:

ulimit -n

Still prints out 1024. I am not sure why the change has no effect, and I think this is the reason I get the PHP error. If needed I can paste the whole file or any other configuration file. I am using PHP 5.6, nginx 1.8.0 and php-fpm.
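
To check what limit the php-fpm workers actually run with (limits.conf is applied through PAM at login, so a daemon may not pick it up at all), something along these lines should work; it assumes the workers show up under the name php-fpm:

# pick one php-fpm worker and show the open-files limit it really has
pid=$(pgrep -f php-fpm | head -n 1)
grep 'open files' /proc/$pid/limits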

The workaround that works for me for now is to manually restart nginx with:

service nginx restart

After this, things work again. The problem mainly occurs when I run unit tests or Behat tests, or when I make a lot of requests to the web server.
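
A rough way to see whether file descriptors really are piling up while the tests run is to count them per worker process from /proc; this is only a diagnostic sketch, it assumes the workers are named php-fpm and nginx, and it may need to be run as root:

for pid in $(pgrep -f 'php-fpm|nginx'); do
    echo "$pid: $(ls /proc/$pid/fd | wc -l) open file descriptors"
done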

Sounds like a long-running process is opening files and not closing them properly. Do you have any idea which process might be doing that? Are you doing something that you would expect to open a large number of files? It sounds like it could be an issue with your unit-testing library. I'm not familiar with Behat; have you searched for this error specifically in relation to the libraries/software you're using? When you talk about "making a lot of requests to the web server", are those all concurrent requests, which might well cause lots of file handles to be opened?

Ultimately, I think you need to solve the problem - if it is, indeed, a problem - of opening way more files than you're expecting to.
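
If the descriptor count does keep climbing, listing what a suspect worker has open can point at the leak. A sketch along these lines, assuming lsof is installed, the worker is named php-fpm, and it is run as root:

# show the most frequently opened paths held by one php-fpm worker
lsof -p "$(pgrep -f php-fpm | head -n 1)" | awk '{print $NF}' | sort | uniq -c | sort -rn | head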

You should increase the per-user file limit for the user running the PHP processes. Check which user your PHP processes run as and increase the limit for that user. You can do it like this:

$ cat /etc/security/limits.conf
*          hard    nofile      500000
*          soft    nofile      500000
root       hard    nofile      500000
root       soft    nofile      500000
www-data   hard    nofile      500000
www-data   soft    nofile      500000

reference: https://rtcamp.com/tutorials/linux/increase-open-files-limit/
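
Note that raising the limit in limits.conf alone may not reach daemons that are started at boot rather than through a login shell. php-fpm and nginx also accept a limit in their own configuration; the directives below are the documented rlimit_files (php-fpm pool) and worker_rlimit_nofile (nginx) settings, while the file paths are assumptions for a Debian-style PHP 5.6 setup:

; /etc/php5/fpm/pool.d/www.conf
rlimit_files = 500000

# /etc/nginx/nginx.conf
worker_rlimit_nofile 500000;

Both services need a restart afterwards (the php-fpm service may be called php5-fpm or php5.6-fpm depending on how PHP was installed).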

The solution was to do vagrant halt and then vagrant ssh again. After that, ulimit -n printed out 10000. It looks like simply logging out and back in with the user was not enough for some reason.
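
For reference, vagrant reload does the halt/up cycle in one step, and the new limit can be verified without opening an interactive shell; a minimal check, assuming the same Vagrant box:

vagrant reload
vagrant ssh -c 'ulimit -n'    # should now print 10000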
