
too many files open error with multiprocessing

I have some code that uses multiprocessing on about 10000 files, running on a 12-vCPU Ubuntu machine.

import os
import multiprocessing

import nltk

def process_file(name):
    inp = open(name)
    out = open(name.split('.')[0] + 'wikiout.txt', 'a')
    for row in inp:
        row = row.strip()
        sent_text = nltk.sent_tokenize(row)

        for sent in sent_text:
            pass  # process sentence

    inp.close()
    out.close()

if __name__ == '__main__':
    processes = []
    for i in 'ABCDEF':
        for j in 'ABCDEFGHIJKLMNOPQRSTUVWXYZ':
            for k in range(100):
                filename = os.path.join(os.path.dirname(__file__), (i + j + '/' + 'wiki_' + str(k) + '.txt'))

                p = multiprocessing.Process(target=process_file, args=(filename,))
                processes.append(p)
                p.start()

    for process in processes:
        process.join()

For some reason, I am getting this error:

  File "wikirules.py", line 37, in <module>
    p.start()
  File "/usr/lib/python3.8/multiprocessing/process.py", line 121, in start
    self._popen = self._Popen(self)
  File "/usr/lib/python3.8/multiprocessing/context.py", line 224, in _Popen
    return _default_context.get_context().Process._Popen(process_obj)
  File "/usr/lib/python3.8/multiprocessing/context.py", line 277, in _Popen
    return Popen(process_obj)
  File "/usr/lib/python3.8/multiprocessing/popen_fork.py", line 19, in __init__
    self._launch(process_obj)
  File "/usr/lib/python3.8/multiprocessing/popen_fork.py", line 69, in _launch
    child_r, parent_w = os.pipe()
OSError: [Errno 24] Too many open files
Traceback (most recent call last):
  File "/usr/lib/python3.8/multiprocessing/process.py", line 315, in _bootstrap
  File "/usr/lib/python3.8/multiprocessing/process.py", line 108, in run
  File "wikirules.py", line 13, in process_file
  File "/usr/local/lib/python3.8/dist-packages/nltk/tokenize/__init__.py", line 106, in sent_tokenize
  File "/usr/local/lib/python3.8/dist-packages/nltk/data.py", line 752, in load
  File "/usr/local/lib/python3.8/dist-packages/nltk/data.py", line 877, in _open
  File "/usr/local/lib/python3.8/dist-packages/nltk/data.py", line 327, in open
OSError: [Errno 24] Too many open files: '/root/nltk_data/tokenizers/punkt/PY3/english.pickle'

Any clue why this might be happening? I'm still new to multiprocessing, so this shouldn't be opening more than 12 files at a time.

Your code is trying to run

len('ABCDEF') * len('ABCD...Z') * len(range(100)) = 6 * 26 * 100 = 15 600

operating system processes simultaneously.
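As a quick way to see the limit being hit (a sketch, not part of the original answer): each started multiprocessing.Process keeps pipe descriptors open in the parent, so thousands of simultaneous children exhaust a typical default soft limit of 1024. The per-process limit can be inspected with the resource module:

import resource

# Soft and hard limits on open file descriptors (RLIMIT_NOFILE) for this process.
# The os.pipe() call in the traceback fails once the soft limit is reached.
soft, hard = resource.getrlimit(resource.RLIMIT_NOFILE)
print(f"open file descriptor limit: soft={soft}, hard={hard}")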

The multiprocessing module actually contains relatively low-level primitives for multiprocess work. For basic tasks the standard library recommends a safer and more convenient option: the concurrent.futures module, which contains pool implementations for both threads and processes and can be very useful, especially for "embarrassingly parallel" workloads.

Below is an example of how the code from the question could be rewritten using concurrent.futures and some other Python features such as generators, context managers, and the pathlib module.

import concurrent.futures as futures
import itertools
import pathlib

import nltk

BASE_PATH = pathlib.Path(__file__).parent.absolute()

def filename_generator():
    """produce filenames sequence"""
    for i, j, k in itertools.product("ABCDEF", "ABCDEFGHIJKLMNOPQRSTUVWXYZ", range(100)):
        yield BASE_PATH / f"{i}{j}/wiki_{k}.txt"

def worker(filename: pathlib.Path):
    """do all the job"""
    out_filename = filename.with_suffix('.wikiout.txt')
    with open(filename) as inp, open(out_filename, "a") as out:
        for row in inp:
            text = row.strip()
            sent_text = nltk.sent_tokenize(text)
            for sent in sent_text:
                """process sentence"""

def main():
    with futures.ProcessPoolExecutor() as pool:
        # mapping future->filename, useful in case of error
        task_to_filename = {pool.submit(worker, f): f for f in filename_generator()}
        for f in futures.as_completed(task_to_filename):
            try:
                f.result()
            except Exception as e:
                filename = task_to_filename[f]
                print(f"{filename} processing failed: {e}")

if __name__ == "__main__":
    main()
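One detail worth noting (based on the documented default, not stated explicitly above): ProcessPoolExecutor() with no arguments creates os.cpu_count() worker processes, so on the 12-vCPU machine only about 12 input/output file pairs are open at any one time. The pool size can also be capped explicitly, for example:

import concurrent.futures as futures

def worker(filename):
    """placeholder for the worker shown above"""

if __name__ == "__main__":
    # max_workers caps the pool at 12 processes, matching the 12 vCPUs
    # mentioned in the question; the default is os.cpu_count().
    with futures.ProcessPoolExecutor(max_workers=12) as pool:
        pool.map(worker, ["AA/wiki_0.txt"])  # hypothetical filename, for illustration only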
