Too many open files error with python multiprocessing

I'm having multiple problems with a Python (v3.7) script using multiprocessing (mp hereafter). One of them is that my computations end with an "OSError: [Errno 24] Too many open files". My scripts and modules are complex, so I've reduced the problem to the following code:

import multiprocessing as mp
import time

def worker(n):
    time.sleep(1)

n = 2000

procs = [mp.Process(target=worker, args=(i,)) for i in range(n)]
nprocs = 40
i = 0

# keep at most nprocs workers alive at any time
while i < n:
    if len(mp.active_children()) <= nprocs:
        print('Starting proc {:d}'.format(i))
        procs[i].start()
        i += 1
    else:
        time.sleep(1)

[p.join() for p in procs]

This code fails once roughly 1020 processes have been executed. I've always used multiprocessing in a similar fashion without running into this problem, and I'm running this on a server with ~120 CPUs. I recently switched from Python 2.7 to 3.7; I don't know if that could be an issue.

Here's the full trace:

Traceback (most recent call last):
  File "test_toomanyopen.py", line 18, in <module>
    procs[i].start()
  File "/p/jqueryrel/local_install/conda_envs/trois/lib/python3.7/multiprocessing/process.py", line 112, in start
    self._popen = self._Popen(self)
  File "/p/jqueryrel/local_install/conda_envs/trois/lib/python3.7/multiprocessing/context.py", line 223, in _Popen
    return _default_context.get_context().Process._Popen(process_obj)
  File "/p/jqueryrel/local_install/conda_envs/trois/lib/python3.7/multiprocessing/context.py", line 277, in _Popen
    return Popen(process_obj)
  File "/p/jqueryrel/local_install/conda_envs/trois/lib/python3.7/multiprocessing/popen_fork.py", line 20, in __init__
    self._launch(process_obj)
  File "/p/jqueryrel/local_install/conda_envs/trois/lib/python3.7/multiprocessing/popen_fork.py", line 69, in _launch
    parent_r, child_w = os.pipe()
OSError: [Errno 24] Too many open files

I've seen a similar issue here, but I don't see how I can solve it.

Thanks

To put the comments into an answer, there are several options to fix this:

  • Increase the limit on open file handles, e.g. by editing /etc/security/limits.conf (see here). A per-process alternative using Python's resource module is sketched after this list.
  • Don't spawn so many processes. With 120 CPUs, it doesn't really make sense to spawn more than 120 of them.
    • Restructuring the code around a Pool may help; see the Pool sketch after this list.
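For the first bullet, a minimal sketch (Unix only, and an alternative I'm adding rather than the answer's limits.conf edit) of inspecting and raising the soft open-file limit for the current process with the standard-library resource module. The failure at roughly 1020 processes is consistent with a common default soft limit of 1024 file descriptors:

import resource

# query the current soft/hard limits on open file descriptors
soft, hard = resource.getrlimit(resource.RLIMIT_NOFILE)
print('soft={}, hard={}'.format(soft, hard))  # e.g. soft=1024, hard=4096

# raise the soft limit up to the hard limit; raising the hard limit
# itself requires root privileges
resource.setrlimit(resource.RLIMIT_NOFILE, (hard, hard))

Note that this only moves the ceiling; if the script keeps accumulating open pipes, it will eventually hit the new limit too.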
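For the Pool suggestion, a minimal sketch of the same workload (function name, task count, and worker count taken from the question) restructured around multiprocessing.Pool, which creates a fixed set of worker processes once and reuses them, so only pool-size pipes are ever open at a time:

import multiprocessing as mp
import time

def worker(n):
    time.sleep(1)
    return n

if __name__ == '__main__':
    # 40 workers are created once and reused for all 2000 tasks
    with mp.Pool(processes=40) as pool:
        results = pool.map(worker, range(2000))

The with block also guarantees that the pool's processes and their pipes are cleaned up when the work is done.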
