Python: How to use different logfiles for processes in multiprocessing.Pool?
How to keep certain processes in a multiprocessing.Pool() from printing to stdout in Python?
I currently have the following situation:
    import multiprocessing
    from time import sleep, time

    def take_time(t):
        sleep(t)
        print("Took %d seconds!" % t)

    def multip(num_cores, data):
        p = multiprocessing.Pool(num_cores)
        start_time = time()
        p.map(take_time, data)
        end_time = time()
        print("total time taken: %d" % (end_time - start_time))
Suppose num_cores = 2 and data = (1, 1, 3).
>>> multip(4, data)
Took 1 seconds!
Took 1 seconds!
Took 3 seconds!
How can I arrange things so that only one process prints to stdout at a time? It doesn't particularly matter which one goes first, as long as no two print simultaneously. Say the process corresponding to data[1] prints first; the desired output would then be
>>> multip(4, data)
Took 1 seconds!
Took 3 seconds!
I suspect there is a more polished tool for this than map(). Thanks!
One strategy is to write from only one running process and have the others send their output to a queue. Doing this requires your main process to start the workers asynchronously (e.g. with map_async) and then read from the queue until they are finished. With your example, you would do something like:
    import multiprocessing
    from time import sleep, time

    q = multiprocessing.Queue()

    def take_time(t):
        sleep(t)
        q.put("Took %d seconds!" % t)

    def multip(num_cores, data):
        p = multiprocessing.Pool(num_cores)
        start_time = time()
        result = p.map_async(take_time, data)
        while not result.ready():
            print(q.get())
        end_time = time()
        print("total time taken: %d" % (end_time - start_time))
For a real-world example of how this approach can be applied, take a look at the code for the docker-compose log command, which reads the output of potentially many containerized processes and combines it into a single stream.
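One caveat with the polling loop above: q.get() can block forever if result.ready() flips to true while the queue is momentarily empty. A variant that sidesteps this is to read exactly one message per task, since each worker puts exactly one item. The sketch below reworks the answer's take_time/multip (the names are reused for illustration, not the original code) and uses a Manager queue, which, unlike a plain multiprocessing.Queue, can be passed to Pool workers as an argument:

```python
import multiprocessing
from time import sleep

def take_time(args):
    # Each worker sends exactly one message through the shared queue.
    q, t = args
    sleep(t)
    q.put("Took %d seconds!" % t)

def multip(num_cores, data):
    # A Manager queue proxy is picklable, so it can travel to the
    # workers inside the task arguments.
    with multiprocessing.Manager() as m:
        q = m.Queue()
        with multiprocessing.Pool(num_cores) as p:
            p.map_async(take_time, [(q, t) for t in data])
            # One message per task, so read len(data) times; no need
            # to poll result.ready() and risk blocking on an empty queue.
            return [q.get() for _ in data]

if __name__ == "__main__":
    for line in multip(2, (1, 1, 3)):
        print(line)
```

Because the main process is the only one that calls print(), the lines can never interleave, regardless of how many workers finish at once.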