
Scrapy "too many open files" 错误,由太多套接字引起

[英]Scrapy "too many open files" error, caused by too many sockets

I've run into a problem where my spider gets error 24: too many open files. After searching, I changed ulimit to 102400...

However, my spider still gets this error after running for 1 week.

At first I thought it was caused by my pipeline (I changed this), so I checked /proc/{pid}/fd and found a large number of open sockets.
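A quick way to see how many descriptors the spider process is holding, and how many of them are sockets, is to count the entries under /proc/<pid>/fd (the <pid> here is just a placeholder for the spider's process id):

ls /proc/<pid>/fd | wc -l
ls -l /proc/<pid>/fd | grep -c socket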

BTW, you can find the code here: https://github.com/yz21606948/sinaSpider/tree/master/sina

I solved it by increasing the limit with:

ulimit -n unlimited
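As a side note (not part of the original answer): since the limit was already raised to 102400 and the error still appeared, it may be worth confirming that the new limit actually applies to the running spider process, because a limit set in one shell does not affect processes started elsewhere. The effective limit for a given process can be read from /proc (again, <pid> is a placeholder):

grep "Max open files" /proc/<pid>/limits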
