
Python subprocesses block when doing a blocking read from stdin in the main process

I have a Python multiprocessing application which starts "workers" using the multiprocessing API. The main process is itself started by a service process which is not written in Python. The workers may themselves start other non-Python subprocesses using subprocess.Popen.

For clarity, this is the entire process hierarchy:

  • service.exe: service process (native EXE)
    • python.exe: Python main process (program below)
      • python.exe: Python sub process (task function started by Process)
        • subprocess.exe: Native sub process (see explanation below)

When the service process is stopped, it must tell the Python process to exit. I am using standard input for this. The advantage is that if the service process crashes or is killed, standard input of the Python process is closed, so it will exit, and there will be no orphan processes.

import multiprocessing
import time
import sys


def task():
    print("Task started...")
    # TODO: Start a native process here using subprocess.Popen
    time.sleep(3)
    print("Task ended")

if __name__ == '__main__':
    process = multiprocessing.Process(target=task)
    process.start()

    # time.sleep(3)  # "workaround"
    sys.stdin.read()

    print("Terminating process...")
    process.terminate()

However, it seems that when I add sys.stdin.read(), the Python subprocess starts, but it doesn't do anything. It just seems to hang.

A (bad) workaround was to add time.sleep(3) before reading from standard input. Then the program above works. However, subprocesses started by the Python subprocess can still block, and they block only if I do the blocking read in the main process.

This problem does not occur on all systems. It was observed on one Windows 8 machine and never occurred on another Windows machine. I am using Python 2.7.2.

My question is: How can a blocking read in the main process affect subprocesses? Shouldn't the subprocess start and run independently of whatever I do in the main process? (I only want to understand this problem. If you find a better solution for stopping the Python process from the service process, I will be thankful, but it's the strange blocking behavior that is giving me nightmares.)

Your subprocesses aren't hanging. One of my favorite debugging techniques when using the multiprocessing library is to make the subprocesses write to text files instead of printing to stdout. That way you avoid all the complications of pipes: wondering whether your subprocesses inherited the same stdin/stdout, full pipe buffers, and so on. If we modify your task to be the following:

def task():
    with open('taskfile.txt', 'w') as fo:
        fo.write("Task started...")
        # TODO: Start a native process here using subprocess.Popen
        time.sleep(3)
        fo.write("Task ended")

it produces the text file 'taskfile.txt', which contains the following:

Task started...Task ended

Therefore, your tasks are running and exiting just fine. Main is just waiting for input from stdin. I suspect you weren't seeing the "Task started..." message because processes launched with multiprocessing.Process() have their own stdin and stdout pipes that aren't connected to the same console as main's.
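If you want to see the children's activity on the console without relying on stdout inheritance, multiprocessing has built-in logging support via log_to_stderr() and get_logger(). A minimal sketch (the log level and messages here are illustrative, not from the original program):

```python
import logging
import multiprocessing
import time


def task():
    # get_logger() returns multiprocessing's own logger; the handler
    # installed by log_to_stderr() in the parent is propagated to
    # child processes, so these messages reach the parent's stderr.
    logger = multiprocessing.get_logger()
    logger.info("Task started...")
    time.sleep(1)
    logger.info("Task ended")


if __name__ == '__main__':
    # Route multiprocessing's logger to stderr, which is typically
    # unbuffered, so child messages appear promptly on the console.
    multiprocessing.log_to_stderr(logging.INFO)
    process = multiprocessing.Process(target=task)
    process.start()
    process.join()
```

Writing to stderr sidesteps the stdout-buffering and console-attachment questions entirely, which makes it a handy companion to the drop-a-text-file technique above.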
