
What's the best way to share a dictionary between multiple Python processes?

All processes would read from and write to this dictionary. I want operations on the shared dictionary to be as fast as possible (something like under 50 microseconds).

Sharing a dictionary using multiprocessing.Manager isn't fast enough for me.
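
For reference, a minimal sketch of the Manager-based approach being ruled out here (the worker function and key/value choices are just illustrative): every read and write goes through a proxy to a separate server process, which is why per-operation latency usually lands well above 50 microseconds.

```python
from multiprocessing import Process, Manager

def writer(shared, key, value):
    # Each assignment is a round trip to the Manager's server process.
    shared[key] = value

if __name__ == '__main__':
    with Manager() as manager:
        shared = manager.dict()
        procs = [Process(target=writer, args=(shared, i, i * i)) for i in range(4)]
        for p in procs:
            p.start()
        for p in procs:
            p.join()
        print(dict(shared))  # e.g. {0: 0, 1: 1, 2: 4, 3: 9}
```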

50μs is pretty lenient.

Can you treat the dictionary keys as if they were in a memcached datastore?

This 2009 benchmarking comparison shows python-memcache to be more than fast enough - http://amix.dk/blog/post/19471

There's also dogpile.cache - http://pypi.python.org/pypi/dogpile.cache - which can be backed with memcached.

I'm not sure what the speed hit would be from converting between strings and Python objects.
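
As a rough sketch of the memcached route, assuming the python-memcache client and a memcached server already running on localhost:11211 (host, port, and key names here are placeholders, not from the question): the client pickles Python objects on the way in and unpickles them on the way out, which is where that conversion cost would show up.

```python
import memcache

# Connect to a local memcached instance (assumed to be running on the
# default port 11211).
mc = memcache.Client(['127.0.0.1:11211'])

# Any process that connects to the same server sees the same keys.
mc.set('my_key', {'nested': 'value'})   # object is pickled on the way in
value = mc.get('my_key')                # and unpickled on the way out
print(value)
```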

Just use the usual mutexes and locks, I would think, no? The lock.acquire() / lock.release() API from the 'threading' library should help: http://docs.python.org/3.3/library/threading.html

But I don't think you'll gain much speed, as your design seems to be limited by the Python Global Interpreter Lock (GIL), which may be the roadblock here.

I may be wrong.
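
A minimal sketch of lock-guarded access to a shared dict, assuming the data still lives in a Manager dict; note that to synchronize separate processes the lock has to come from multiprocessing rather than threading, since threading.Lock only coordinates threads within a single process.

```python
from multiprocessing import Process, Manager, Lock

def worker(shared, lock, key, value):
    # acquire()/release() via the lock's context-manager form
    with lock:
        shared[key] = value

if __name__ == '__main__':
    with Manager() as manager:
        shared = manager.dict()
        lock = Lock()
        procs = [Process(target=worker, args=(shared, lock, i, i * i)) for i in range(4)]
        for p in procs:
            p.start()
        for p in procs:
            p.join()
        print(dict(shared))
```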
