
Python Multiprocessing Async Can't Terminate Process

I have an infinite loop running async but I can't terminate it. Here is a similar version of my code:

from multiprocessing import Pool
test_pool = Pool(processes=1)
self.button1.clicked.connect(self.starter)
self.button2.clicked.connect(self.stopper)

    def starter(self):
        global test_pool
        test_pool.apply_async(self.automatizer)

    def automatizer(self):
        i = 0
        while i != 0:
            self.job1()
            # safe stop point
            self.job2()
            # safe stop point
            self.job3()
            # safe stop point

    def job1(self):
        # doing some stuff

    def job2(self):
        # doing some stuff

    def job3(self):
        # doing some stuff

    def stopper(self):
        global test_pool
        test_pool.terminate()

My problem is that terminate() inside the stopper function doesn't work. I tried putting terminate() inside the job1, job2, and job3 functions, and also at the end of the loop started by starter, but it still doesn't work. How can I stop this async process?

Stopping the process at any time would be good enough, but is it possible to make it stop at the points I want? I mean, if a stop command (I'm not sure what that command would be) is given to the process, I want it to finish the current step up to the next "# safe stop point" marker and then terminate.

You really should avoid using terminate() in normal operation. It should only be used in unusual cases, such as hanging or unresponsive processes. The normal way to end a process pool is to call pool.close() followed by pool.join().
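A minimal, self-contained sketch of that normal shutdown order (the work function and values here are invented just for illustration):

import multiprocessing

def work(x):
    # stand-in for a real task
    return x * x

if __name__ == "__main__":
    pool = multiprocessing.Pool(processes=1)
    result = pool.apply_async(work, (3,))
    print(result.get())  # prints 9
    pool.close()  # no new tasks may be submitted after this
    pool.join()   # blocks until every worker has finished and exited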

These methods do require the function your pool is executing to return, and the call to pool.join() will block your main process until it does. I would suggest adding a multiprocessing.Queue to give yourself a way to tell your subprocess to exit:

import multiprocessing

# this import is NOT the same as multiprocessing.Queue - it is only here for
# the Queue.Empty exception (on Python 3 the module is named "queue")
import Queue

queue = multiprocessing.Queue()  # not the same as a Queue.Queue()

def stopper(self):
   # don't need "global" keyword to call a global object's method
   # it's only necessary if we want to modify a global
   queue.put("Stop") 
   test_pool.close()
   test_pool.join()

def automatizer(self):
    while True: # cleaner infinite loop - yours was never executing
        for func in [self.job1, self.job2, self.job3]: # iterate over methods
            func() # call each one

            # between each function call, check the queue for "poison pill"
            try:
                if queue.get(block=False) == "Stop":
                    return
            except Queue.Empty:
                pass

Since you didn't provide a more complete code sample, you'll have to figure out where to actually instantiate the multiprocessing.Queue and how to pass things around. Also, the comment from Janne Karila was correct: you should switch your code to use a single Process instead of a pool if you're only using one process at a time anyway. The Process class also has a blocking join() method that waits for the process to end once its target function has returned. The only safe way to end processes at "known safe points" is to implement some kind of interprocess communication like I've done here. Pipes would work as well.
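A rough sketch of that single-Process alternative, assuming the job functions are empty placeholders and that the queue is simply passed to the worker as an argument (one of several ways you could wire this up):

import multiprocessing
import Queue  # for the Queue.Empty exception (named "queue" on Python 3)

def job1(): pass  # placeholders for the real work
def job2(): pass
def job3(): pass

def automatizer(stop_queue):
    while True:
        for func in [job1, job2, job3]:
            func()
            # safe stop point: check for the "poison pill" between jobs
            try:
                if stop_queue.get(block=False) == "Stop":
                    return
            except Queue.Empty:
                pass

if __name__ == "__main__":
    stop_queue = multiprocessing.Queue()
    worker = multiprocessing.Process(target=automatizer, args=(stop_queue,))
    worker.start()
    # ... later, e.g. from the stop button's handler:
    stop_queue.put("Stop")
    worker.join()  # blocks until automatizer() returns at the next safe point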
