
How to share a dictionary between multiple processes in python without locking

I need to share a huge dictionary (around 1 GB in size) between multiple processes; since all processes will only ever read from it, I don't need locking.

Is there any way to share a dictionary without locking?

The multiprocessing module in Python provides an Array class which allows sharing without locking by setting lock=False; however, there is no such option for the dictionary provided by the Manager in the multiprocessing module.
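
For reference, a minimal sketch of the Array behaviour mentioned above (the element type, values, and worker count here are just placeholders for illustration):

from multiprocessing import Process, Array

def reader(arr):
    # Pure reads from the shared array, so no synchronization is needed
    print(sum(arr))

if __name__ == '__main__':
    # 'i' = signed int; lock=False returns a raw shared array with no wrapping lock
    shared = Array('i', [1, 2, 3, 4, 5], lock=False)
    procs = [Process(target=reader, args=(shared,)) for _ in range(4)]
    for p in procs:
        p.start()
    for p in procs:
        p.join()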

Well, in fact the dict on a Manager has no locks at all! I guess this is true for the other shared objects you can create through the manager too. How do I know this? I tried:

from multiprocessing import Process, Manager

def f(d):
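    # Each process increments the shared counter 10000 times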
    for i in range(10000):
        d['blah'] += 1

if __name__ == '__main__':
    manager = Manager()

    d = manager.dict()
    d['blah'] = 0
    procs = [ Process(target=f, args=(d,)) for _ in range(10) ]
    for p in procs:
        p.start()
    for p in procs:
        p.join()

    print(d)

If there were locks on d, the result would be 100000. But instead, the result is pretty random, and that is a nice illustration of why locks are needed when you modify stuff ;-)

So just go ahead and use manager.dict().
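
For the read-only case in the question, a minimal sketch of that suggestion might look like the following (the keys, values, and worker count are made up for illustration):

from multiprocessing import Process, Manager

def lookup(shared, keys):
    # Pure reads: no process ever mutates the shared dict,
    # so the race shown above cannot occur
    for k in keys:
        print(shared[k])

if __name__ == '__main__':
    manager = Manager()
    shared = manager.dict({'a': 1, 'b': 2, 'c': 3})

    procs = [Process(target=lookup, args=(shared, ['a', 'b', 'c'])) for _ in range(4)]
    for p in procs:
        p.start()
    for p in procs:
        p.join()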
