
Python multiprocessing loses activity without exiting file

I have a problem where my .py file, which maxes out the CPU through multiprocessing, stops doing work without the .py file ever exiting.

I am running a heavy task that uses all cores on an old MacBook Pro (2012). The task runs fine at first, and I can see four python3.7 processes appear in the Activity Monitor window. However, after about 20 minutes, those four python3.7 processes disappear from Activity Monitor.

The strangest part is that the script itself is still running, i.e. it never threw an uncaught exception and never exited.

Would you guys/gals have any ideas as to what's going on? My guess is that either 1) there's an error in the script, or 2) the old computer is overheating.

Thanks!

Edit: Below is the multiprocessing code. The function to execute is func, which takes a list as its argument. I hope this helps!

import multiprocessing

def main():
    pool = multiprocessing.Pool()
    for i in range(24):
        pool.apply_async(func, args=([],))
    pool.close()
    pool.join()

if __name__ == '__main__':
    main()
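One thing worth checking: `apply_async` silently swallows any exception raised inside a worker unless you keep the returned `AsyncResult` and call `.get()` on it, so workers can die without the parent ever noticing. A minimal sketch of the same structure with that change (using a placeholder `func`, since the real one isn't shown):

```python
import multiprocessing

# Placeholder worker; an assumption standing in for the OP's real func.
def func(items):
    return len(items)

def main():
    pool = multiprocessing.Pool()
    # Keep the AsyncResult handles instead of discarding them.
    results = [pool.apply_async(func, args=([],)) for _ in range(24)]
    pool.close()
    pool.join()
    # .get() re-raises any exception that occurred in the worker process,
    # so a crashing task surfaces here instead of failing silently.
    for r in results:
        print(r.get())

if __name__ == '__main__':
    main()
```

If a worker raised, the corresponding `.get()` call will re-raise that exception in the parent, which makes silent failures visible.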

Use a context manager to handle closing processes properly.

from multiprocessing import Pool

def main():
    with Pool() as p:
        result = p.apply_async(func, args=([],))
        # .get() blocks until the task finishes and re-raises any
        # exception raised inside the worker
        print(result.get())

if __name__ == '__main__':
    main()

I wasn't sure what you were doing with the for i in range() part.
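If the loop was intentional (24 tasks), it works inside the context manager too. A sketch with a placeholder `func`; the `maxtasksperchild` argument is an assumption worth trying, since it recycles worker processes periodically and can help if long-lived workers grow in memory until the OS kills them:

```python
from multiprocessing import Pool

# Placeholder worker; the OP's real func is unknown.
def func(items):
    return sum(items)

def main():
    # maxtasksperchild=10 restarts each worker after 10 tasks,
    # limiting memory growth in long-running pools (assumed value).
    with Pool(maxtasksperchild=10) as p:
        results = [p.apply_async(func, args=([1, 2, 3],)) for _ in range(24)]
        # Collect inside the with-block so the pool is still alive;
        # .get() waits for each task and re-raises worker exceptions.
        values = [r.get() for r in results]
    print(values)

if __name__ == '__main__':
    main()
```

Note that the results must be collected before the `with` block exits, because leaving the block terminates the pool.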
