
python3 multiprocessing example crashed my pc :(

I am new to multiprocessing

I have run the example code for two 'highly recommended' multiprocessing examples given in response to other Stack Overflow multiprocessing questions. Here is one of them (which I dare not run again!)

test2.py (running from pydev)

import multiprocessing

class MyFancyClass(object):

    def __init__(self, name):
        self.name = name

    def do_something(self):
        proc_name = multiprocessing.current_process().name
        print(proc_name, self.name)


def worker(q):
    obj = q.get()
    obj.do_something()



queue = multiprocessing.Queue()

p = multiprocessing.Process(target=worker, args=(queue,))
p.start()

queue.put(MyFancyClass('Fancy Dan'))

# Wait for the worker to finish
queue.close()
queue.join_thread()
p.join()

When I run this, my computer immediately starts to slow down, and it gets progressively slower. After some time I managed to open Task Manager, only to see MANY, MANY python.exe entries under the Processes tab. After trying to end some of them, my mouse stopped moving. It was the second time I was forced to reboot.
I am too scared to attempt a third example...

Running on: Intel(R) Core(TM) i7 CPU 870 @ 2.93GHz (8 CPUs), ~2.9GHz, Windows 7 64-bit

If anyone knows what the issue is and can provide a VERY SIMPLE example of multiprocessing (send a string to a worker process, alter it, and send it back for printing), I would be very grateful.

From the docs:

Make sure that the main module can be safely imported by a new Python interpreter without causing unintended side effects (such as starting a new process).

Thus, on Windows, you must wrap your code inside an if __name__ == '__main__': block.


For example, this sends a string to the worker process; the worker reverses the string and the main process prints the result:

import multiprocessing as mp

def worker(inq, outq):
    # Receive a string from the input queue, reverse it,
    # and put the result on the output queue.
    obj = inq.get()
    obj = obj[::-1]
    outq.put(obj)

if __name__ == '__main__':
    inq = mp.Queue()
    outq = mp.Queue()

    p = mp.Process(target=worker, args=(inq, outq))
    p.start()

    inq.put('Fancy Dan')

    # Wait for the worker to finish, then read back its result
    p.join()
    result = outq.get()
    print(result)
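If you run this as a script, it should print the reversed string, naD ycnaF, from the main process.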

Because of the way multiprocessing works on Windows (child processes import the __main__ module), the __main__ module must not execute anything as a side effect of being imported -- any code that should run only when the script is executed directly must be protected by the if __name__ == '__main__' idiom. Your corrected code:

import multiprocessing

class MyFancyClass(object):

    def __init__(self, name):
        self.name = name

    def do_something(self):
        proc_name = multiprocessing.current_process().name
        print(proc_name, self.name)


def worker(q):
    obj = q.get()
    obj.do_something()


if __name__ == '__main__':
    queue = multiprocessing.Queue()

    p = multiprocessing.Process(target=worker, args=(queue,))
    p.start()

    queue.put(MyFancyClass('Fancy Dan'))

    # Wait for the worker to finish
    queue.close()
    queue.join_thread()
    p.join()

Might I suggest this link? It uses threads instead of multiprocessing, but many of the principles are the same.
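In case the link goes stale, here is a minimal sketch of the same string-reversal exchange done with threads and a thread-safe queue; the worker function and queue names below are just illustrative and are not taken from the linked article:

import threading
import queue

def worker(inq, outq):
    # Same idea as the multiprocessing example above: take a string
    # from one queue, reverse it, and hand the result back on another.
    obj = inq.get()
    outq.put(obj[::-1])

inq = queue.Queue()
outq = queue.Queue()

t = threading.Thread(target=worker, args=(inq, outq))
t.start()

inq.put('Fancy Dan')

t.join()
print(outq.get())

Because threads run inside the same interpreter, no if __name__ == '__main__' guard is needed here; that guard matters specifically for multiprocessing on Windows.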
