
What is the limit on scripts running at the same time on an nginx/php-fpm config?

The problem is that I have to use curl, and sometimes the curl requests take a long time because of timeouts. I have set the timeout to 1 second, so no request should take more than 1 second, but the server is still unable to process other PHP requests.

My question is: how many concurrent scripts (running at the same time) can nginx/php-fpm handle? What I see is that a few requests lasting 1 second make the whole server unresponsive. What settings can I change so that more requests can be processed at the same time?
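For context, the concurrency ceiling here is usually php-fpm's worker pool rather than nginx: each worker handles exactly one request at a time, so a worker blocked on curl can serve nothing else. The pool size is set in the pool configuration (typically `www.conf`); the values below are illustrative, not recommendations:

```ini
; php-fpm pool config (e.g. /etc/php-fpm.d/www.conf; path varies by distro)
pm = dynamic
pm.max_children = 50       ; hard cap on concurrently running PHP requests
pm.start_servers = 10
pm.min_spare_servers = 5
pm.max_spare_servers = 15
; requests beyond pm.max_children wait in the listen backlog,
; which is what makes the whole server appear unresponsive
```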

Multicurl is indeed not the solution to your problem, but asynchrony probably is. I am not sure the solution is tweaking Nginx. It would scale better if you considered one of the following options:

  • You can abstract curl with Guzzle (http://docs.guzzlephp.org/en/latest/) and use its approach to async calls and promises.

  • You can use Gearmand (http://gearman.org/getting-started/), which lets you send an async message to a remote server that will process the instruction based on a script you register for that message. (I use this mechanism for non-blocking logging.)

Either way, your call will complete in milliseconds and won't block nginx, but your code will have to change a little.

In my case, php-curl did not respond in a timely manner because of DNS.

The problem was that I had to access files from a CDN, but the IP behind the domain changed frequently, and unfortunately curl keeps a DNS cache.

So from time to time it would try to fetch files from IPs that were no longer valid but were still in php-curl's DNS cache.
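For reference, curl's per-handle DNS cache lifetime is configurable, so an alternative to dropping curl entirely would have been to shorten or disable the cache. A minimal sketch (the CDN URL is a placeholder):

```php
<?php
// Hypothetical CDN URL - replace with the real one.
$ch = curl_init('https://cdn.example.com/file.bin');

// Disable curl's internal DNS cache so every request re-resolves
// the hostname (the default cache lifetime is 120 seconds).
curl_setopt($ch, CURLOPT_DNS_CACHE_TIMEOUT, 0);

// Keep the hard limits tight so a slow CDN node cannot block a
// php-fpm worker for long.
curl_setopt($ch, CURLOPT_CONNECTTIMEOUT_MS, 500);
curl_setopt($ch, CURLOPT_TIMEOUT_MS, 1000);

curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
$body = curl_exec($ch);   // false on failure (timeout, stale IP, ...)
curl_close($ch);
```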

I had to drop php-curl completely and use a plain file_get_contents(...) call instead. This completely solved the problem.
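Note that file_get_contents() blocks too, so it is worth giving it an explicit timeout via a stream context rather than relying on the default_socket_timeout setting. A sketch (the URL is a placeholder):

```php
<?php
// Hypothetical CDN URL - replace with the real one.
$url = 'https://cdn.example.com/file.bin';

// Cap the whole HTTP operation at 1 second so a stalled CDN node
// cannot tie up a php-fpm worker indefinitely.
$ctx = stream_context_create([
    'http' => [
        'timeout' => 1.0,   // seconds; fractional values are allowed
    ],
]);

$body = @file_get_contents($url, false, $ctx);
if ($body === false) {
    // Handle the failure instead of blocking further.
    error_log("fetch failed: $url");
}
```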
