
Create child processes inside a child process with Python multiprocessing failed

I observed this behavior when trying to create nested child processes in Python. Here is the parent program, parent_process.py:

import multiprocessing
import child_process

pool = multiprocessing.Pool(processes=4)
for i in range(4):
    pool.apply_async(child_process.run, ())
pool.close()
pool.join()

The parent program calls the run function in the following child program, child_process.py:

import multiprocessing

def run():
    pool = multiprocessing.Pool(processes=4)
    print 'TEST!'
    pool.close()
    pool.join()

When I run the parent program, nothing is printed and the program exits quickly. However, if print 'TEST!' is moved one line up (before the nested pool is created), 'TEST!' is printed four times.

Because errors in a child process are not printed to the screen, this seems to show that the program crashes when a child process tries to create its own child processes.

Could anyone explain what happens behind the scenes? Thanks!
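One way to make such swallowed errors visible (a sketch, not from the original post; failing_worker is a hypothetical stand-in) is to pass an error_callback to apply_async, which Python 3 invokes in the parent process when the task raises:

```python
import multiprocessing

def failing_worker():
    # Stand-in for any worker that raises; without an error_callback
    # (or a .get() on the AsyncResult) the exception is never shown.
    raise RuntimeError('boom')

if __name__ == '__main__':
    pool = multiprocessing.Pool(processes=1)
    # error_callback runs in the parent with the worker's exception
    pool.apply_async(failing_worker, (),
                     error_callback=lambda e: print('worker error:', e))
    pool.close()
    pool.join()
```

Alternatively, keeping the AsyncResult returned by apply_async and calling .get() on it re-raises the worker's exception in the parent.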

According to the documentation for multiprocessing, daemonic processes are not allowed to spawn child processes.

multiprocessing.Pool uses daemonic processes to ensure that they don't leak when your program exits.
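Putting the two together: because Pool workers are daemonic, the nested Pool(...) call inside the worker raises AssertionError: daemonic processes are not allowed to have children, which is exactly the error the original program silently swallows. A minimal sketch that surfaces it via .get() (assuming a Unix-like fork start method; spawn_nested_pool is an illustrative name):

```python
import multiprocessing

def spawn_nested_pool():
    # Runs inside a daemonic Pool worker; Pool() tries to start
    # child processes, and the daemon check in Process.start() fails.
    inner = multiprocessing.Pool(processes=2)
    inner.close()
    inner.join()

if __name__ == '__main__':
    pool = multiprocessing.Pool(processes=2)
    result = pool.apply_async(spawn_nested_pool, ())
    try:
        # .get() re-raises the worker's exception in the parent
        result.get(timeout=30)
    except AssertionError as err:
        print('worker failed:', err)
    pool.close()
    pool.join()
```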

As noxdafox said, multiprocessing.Pool uses daemonic processes. I found a simple workaround that uses multiprocessing.Process instead, whose children are non-daemonic by default:

Parent program:

import multiprocessing
import child_process

processes = [None] * 4
for i in range(4):
    processes[i] = multiprocessing.Process(target=child_process.run, args=(i,))
    processes[i].start()
for i in range(4):
    processes[i].join()

Child program (named child_process.py):

import multiprocessing

def test(info):
    print 'TEST', info[0], info[1]

def run(proc_id):
    pool = multiprocessing.Pool(processes=4)
    pool.map(test, [(proc_id, i) for i in range(4)])
    pool.close()
    pool.join()
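The workaround works because multiprocessing.Process children are non-daemonic by default, while Pool workers are daemonic. A quick check (a sketch; report_daemon_flag is an illustrative name):

```python
import multiprocessing

def report_daemon_flag():
    # Each worker reports whether it is running as a daemon
    return multiprocessing.current_process().daemon

if __name__ == '__main__':
    # Pool workers are daemonic ...
    pool = multiprocessing.Pool(processes=1)
    print('Pool worker daemon:', pool.apply(report_daemon_flag))  # True
    pool.close()
    pool.join()

    # ... but a plain Process is non-daemonic by default,
    # so it is allowed to create children of its own.
    proc = multiprocessing.Process(target=report_daemon_flag)
    print('Process daemon flag:', proc.daemon)  # False
    proc.start()
    proc.join()
```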

The output is 16 lines of TEST:

TEST 0 0
TEST 0 1
TEST 0 3
TEST 0 2
TEST 2 0
TEST 2 1
TEST 2 2
TEST 2 3
TEST 3 0
TEST 3 1
TEST 3 3
TEST 3 2
TEST 1 0
TEST 1 1
TEST 1 2
TEST 1 3

I do not have enough reputation to post a comment, but since the Python version determines the options for running hierarchical multiprocessing (e.g., a post from 2015), I wanted to share my experience. The above solution by Da Kuang worked for me with Python 3.7.1 running through Anaconda 3.

I made a small modification to child_process.py to keep the CPU busy for a little while, so I could check the system monitor and verify that 16 processes were running simultaneously.

import multiprocessing

def test(info):
    print('TEST', info[0], info[1])
    aa = [1] * 100000
    # deliberately quadratic busy-work to keep the CPU occupied for a while
    a = [1 for i in aa if all([ii < 1 for ii in aa])]
    print('exiting')

def run(proc_id):
    pool = multiprocessing.Pool(processes=4)
    pool.map(test, [(proc_id, i) for i in range(4)])
    pool.close()
    pool.join()
