Python multiprocessing program not running to the end
I am new to multiprocessing in Python.

Basically, my problem scenario is that I want to run my Python script in parallel on a set of tables, say 2 tables. The script reads the data from each of the tables in parallel and then writes the data from each of these tables into another table.

I have written the following code snippet to create a multiprocess Python script. However, when I run the script it does not complete, and neither does it throw any error message.
import multiprocessing

count = multiprocessing.cpu_count()
pool = multiprocessing.Pool(processes=count)
args = [('yelp', 'localhost:9160', 'cassa1', 'flight88'),
        ('yelp', 'localhost:9160', 'cassa1', 'flight96')]

for a in args:
    print a
    pool.apply_async(user_input, a)
I would appreciate help on this, as I am both confused and stuck here.
Your script exits before the child processes finish their tasks. Add at the end:
pool.close() # no more tasks
pool.join() # wait for the remaining tasks to complete
Also, you could use the pool.imap*() methods instead:
from multiprocessing import Pool

def safe_user_input(args):
    try:
        return user_input(*args), None
    except Exception as e:
        return None, str(e)

if __name__ == "__main__":
    tables = [
        ('yelp', 'localhost:9160', 'cassa1', 'flight88'),
        ('yelp', 'localhost:9160', 'cassa1', 'flight96'),
    ]
    pool = Pool()  # use all available CPUs
    for result, error in pool.imap_unordered(safe_user_input, tables):
        if error is None:  # no error
            print(result)