
Is there a way to limit the number of concurrent requests from one IP with Gunicorn?

Basically I'm running a Flask web server that crunches a bunch of data and sends it back to the user. We aren't expecting many users (around 60), but I've noticed what could be a concurrency issue. Right now, if I open a tab and send a request to have some data crunched, it takes about 30s, which is acceptable for our application.

If I open another tab and send the same request at the same time, Gunicorn will handle them concurrently, which is great if two separate users are making two separate requests. But what happens if one user opens 4 or 8 tabs and sends the same request? It backs up the server for everyone else. Is there a way I can tell Gunicorn to accept only 1 request at a time from the same IP?

A better solution than the answer by @jon would be to limit access at your web server instead of the application server. It is good practice to keep the responsibilities of the different layers of your application separate: ideally, the application server (Flask) should not carry any rate-limiting configuration or concern itself with where requests come from. The web server's job, in this case nginx's, is to route requests to the right upstream based on certain parameters, and the limiting should be done at that layer.

Coming to the limiting itself, you can do it with the limit_req_zone directive in the http block of your nginx configuration:

http {
    limit_req_zone $binary_remote_addr zone=one:10m rate=1r/s;

    ...

    server {

        ...

        location / {
            limit_req zone=one burst=5;
            proxy_pass ...
        }
    }
}

Here, $binary_remote_addr is the client's IP address; on average no more than 1 request per second is allowed, with bursts not exceeding 5 requests.

Pro-tip: Since subsequent requests from the same IP are held in a queue, there is a good chance of nginx timing out while waiting for the upstream. It is therefore advisable to set a larger proxy_read_timeout and, if the reports take longer, to also increase Gunicorn's timeout.
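As a rough sketch of the Gunicorn side of that tip (the numbers are placeholders, not recommendations), the timeout can be set in a gunicorn.conf.py, which Gunicorn loads as plain Python when started with gunicorn -c gunicorn.conf.py app:app; the matching proxy_read_timeout belongs in the nginx location block shown above:

# gunicorn.conf.py -- loaded via: gunicorn -c gunicorn.conf.py app:app
# Illustrative values only; size them to how long the data crunching really takes.
workers = 2            # a handful of workers is plenty for ~60 users
timeout = 120          # allow a single request up to 120s before the worker is killed
graceful_timeout = 30  # time given to workers to finish in-flight requests on restart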

Documentation of limit_req_zone: http://nginx.org/en/docs/http/ngx_http_limit_req_module.html#limit_req_zone

nginx has also published a blog post on rate limiting.

This is probably NOT best handled at the Flask level. But if you had to do it there, it turns out someone has already designed a Flask extension to do just this:

https://flask-limiter.readthedocs.io/en/stable/

If a request takes at least 30s, then set your limit by address to one request every 30s, as sketched below. This will solve the issue of impatient users clicking obsessively instead of waiting for a very long process to finish.
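A minimal sketch of that per-address limit with Flask-Limiter (the /crunch route name is made up for illustration, and the Limiter constructor signature has shifted between releases, so check the docs above for your installed version):

from flask import Flask
from flask_limiter import Limiter
from flask_limiter.util import get_remote_address

app = Flask(__name__)

# Key each request by the client's remote address (roughly, its IP).
limiter = Limiter(get_remote_address, app=app)

@app.route("/crunch")                 # hypothetical endpoint for the 30s data crunch
@limiter.limit("1 per 30 seconds")    # at most one request per IP per 30-second window
def crunch():
    # ... the long-running data crunching would happen here ...
    return "done"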

This isn't exactly what you requested, since it means that longer/shorter requests may overlap and multiple requests can still run at the same time, so it doesn't fully exclude the multi-tab behaviour you describe. That said, if you are able to tell your users to wait 30 seconds for anything, it sounds like you are in the driver's seat for setting UX expectations. A good wait/progress message will probably help too, if you can build an asynchronous server interaction.
