
Redis too many open files error

I am getting a "too many open files" error when the number of users exceeds a certain threshold (around 1,200 concurrent users).

I increased the limit using this, but I was still getting the same error.

Then I followed this, and nothing changed; I am still getting the same error.

To create the connection, I put the following in my Django settings and use REDIS whenever I need it:

REDIS = redis.StrictRedis(host='localhost', port=6379, db=0)

I did it that way because it was suggested on the redis mailing list, as below:

a. create a global redis client instance and have your code use that.

Is that approach right for connection pooling? Or how do I avoid this "too many open files" error? In the Django response I am getting:

Connection Error (Caused by : [Errno 24] Too many open files)",),)'

Thanks.

You are creating a ConnectionPool per connection; depending on where you create the REDIS connection, you might end up creating a new connection pool every time (e.g. if it's in a view function).

You should make sure you create connections by reusing a long-lived connection pool; if you define the connection pool instance at module level and reuse it when you initialize connections, you can be sure only one pool is created (one per Python process, at least).

If you see the "too many open files" error on Redis with the ulimit set much higher than the number of users (e.g. a ulimit of 10k and 1k connections from Django), then you might be doing something that leaks Redis connections (so they are not closed for some time).

I suggest you start by adding a connection pool and setting a max connection limit there (it's part of the init signature); make sure the pool raises an exception only when the actual number of connected users is greater than that limit.

If you can, increase the ulimit; Redis can easily handle more than 1k connections.

If you really want to limit the number of connections between your Python scripts and Redis, you should consider using the BlockingConnectionPool, which makes clients wait when all connections are in use (rather than throwing an exception), or perhaps put something like twemproxy in between.

