OSError: [Errno 24] Too many open files; in python; difficult to debug

I am running code which fails, sometimes after hours, sometimes after minutes, with the error

OSError: [Errno 24] Too many open files

And I have real trouble debugging this. The error itself is always triggered by the marked line in the code snippet below:

try:
    with open(filename, 'rb') as f:
        contents = f.read()  # <----- error triggered here
except OSError as e:
    print("e = ", e)
    raise
else:
    # other stuff happens

However, I can't see any problem in this part of the code (right?), so I guess that other parts of the code don't close files properly. However, while I do open files quite a bit, I always open them with the 'with' statement, and my understanding is that the files will be closed even if an error occurs (right?). So another part of my code looks like this:

try:
    with tarfile.open(filename + '.tar') as tar:
        tar.extractall(path=target_folder)
except tarfile.ReadError as e:
    print("e = ", e)
except OSError as e:
    print("e = ", e)
else:
    # If everything worked, we are done
    return

The code above does run into a ReadError quite frequently, but even if that happens, the file should be closed, right? So I just don't understand how I can run into too many open files. Sorry that this is not reproducible for you; since I can't debug it enough, I am just fishing for tips here, because I am lost. Any help is appreciated.
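In case it is useful: a minimal sketch, assuming the third-party psutil package, of how one could log the process's open descriptors around the failing call, to see whether they accumulate and on which paths. The log_open_files helper is hypothetical instrumentation, not part of my actual code:

import os
import psutil  # third-party: pip install psutil

proc = psutil.Process(os.getpid())

def log_open_files(label):
    # hypothetical helper: print the current descriptor count
    # and the paths of the regular files the process holds open
    open_files = proc.open_files()
    print(label, "num_fds =", proc.num_fds(), "open files =", len(open_files))
    for of in open_files:
        print("    ", of.path)

Calling log_open_files("before read") just before the open() that fails should reveal what is holding the descriptors.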

Edit: I am on a MacBook. Here is the output of ulimit -a:

core file size          (blocks, -c) 0
data seg size           (kbytes, -d) unlimited
file size               (blocks, -f) unlimited
max locked memory       (kbytes, -l) unlimited
max memory size         (kbytes, -m) unlimited
open files                      (-n) 256
pipe size            (512 bytes, -p) 1
stack size              (kbytes, -s) 8192
cpu time               (seconds, -t) unlimited
max user processes              (-u) 1418
virtual memory          (kbytes, -v) unlimited

Following the suggestion by @sj95126 I changed the code handling the tar file to something which ensures that the file is closed:

try:
    tar = tarfile.open(filename + '.tar')
    tar.extractall(path=target_folder)
except tarfile.ReadError as e:
    print("tarfile.ReadError e = ", e)
except OSError as e:
    print("e = ", e)
else:
    # If everything worked, we are done
    return
finally:
    print("close tar file")
    try:
        tar.close()
    except:
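        # 'tar' is unbound here if tarfile.open() itself raised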
        print("file already closed")

but it did not solve the problem.

On Unix/Linux systems (macOS included) there is a command, ulimit -a, with which you can check the per-process resource limits, including the limit on open files. In @carl's situation the output was:

core file size          (blocks, -c) 0
data seg size           (kbytes, -d) unlimited
file size               (blocks, -f) unlimited
max locked memory       (kbytes, -l) unlimited
max memory size         (kbytes, -m) unlimited
open files                      (-n) 256
pipe size            (512 bytes, -p) 1
stack size              (kbytes, -s) 8192
cpu time               (seconds, -t) unlimited
max user processes              (-u) 1418
virtual memory          (kbytes, -v) unlimited

As you can see, the open files limit is equal to 256:

open files                      (-n) 256

which is a very small value.

@carl's archive contained more than 256 files, so Python was opening a file descriptor for each file it touched. A file descriptor is the handle the operating system hands a process for an open file (think of it as a pointer to that file, through which you access its data), and a process may only hold a limited number of them open at once.
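If you want to see this failure mode in isolation, here is a minimal sketch that deliberately keeps descriptors open until the soft limit is hit (it only uses throwaway temporary files):

import tempfile

handles = []
try:
    while True:
        # each TemporaryFile holds one file descriptor open
        handles.append(tempfile.TemporaryFile())
except OSError as e:
    # with a 256 descriptor limit this fails after roughly 250 files,
    # since the interpreter already holds a few descriptors itself
    print("failed after", len(handles), "open files:", e)
finally:
    for h in handles:
        h.close()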

The solution is to raise the open files value to unlimited or to a very big number.

According to this stack answer, this is how you can change the limit.
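As a sketch of the usual approach: the soft limit for the current shell session can be raised with something like ulimit -n 4096, and from inside Python the standard resource module can do the same, up to the hard limit:

import resource

# query the current soft and hard limits on open file descriptors
soft, hard = resource.getrlimit(resource.RLIMIT_NOFILE)
print("soft =", soft, "hard =", hard)

# raise the soft limit as far as the hard limit allows (capped at 4096 here)
resource.setrlimit(resource.RLIMIT_NOFILE, (min(4096, hard), hard))

The cap via min(4096, hard) matters because a process may not raise its soft limit above its hard limit.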
