Restrict the number of processors used in multiprocessing
For my code, I need to use the multiprocessing module in Python to implement parallelism. I have written the following code for that:
processes = []
for j in range(len(filters)):
    p = multiprocessing.Process(target=task, args=(filters[j], j+1, img, i+1, fname))
    p.start()
    processes.append(p)
for p in processes:
    p.join()  # join every started process, not just the last one created
The above code works fine, but it uses all the available processors in the system. For example, if I have 16 processors, it uses all 16. Is there any way I can control/limit the number of processors used by the multiprocessing module?
You should use multiprocessing.Pool - it gives you a pool of a fixed size.
processes = []
with Pool(processes=4) as pool:
    for j in range(len(filters)):
        p = pool.apply_async(task, args=(filters[j], j+1, img, i+1, fname))
        processes.append(p)
    for result in processes:
        print('\t', result.get())  # fetch results before the pool is closed
The full documentation is here.
This has the added benefit that you are not starting a new process for each task, but reusing the same ones. Given that starting a process is expensive, you will get better performance.
The number of processes to choose is not trivial - it depends on whether your work is CPU bound or I/O bound, and on what load other programs put on your PC. If you are CPU bound, you can get the number of cores like this:
multiprocessing.cpu_count()
You should probably choose a value less than that, e.g. cpu_count() - 2, to leave room for other work, but that's just a guess.
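As a concrete sizing heuristic (my suggestion, not a rule from the documentation), you can clamp the result so it never drops below one worker on small machines:

```python
import multiprocessing

# Leave a couple of cores free for the OS and other programs;
# max(1, ...) keeps the pool usable on 1- or 2-core machines.
n_workers = max(1, multiprocessing.cpu_count() - 2)
print(n_workers)
```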