
Python Multiprocessing - shared memory

  1. From the main process, I create 3 child processes and pass an instance of a 'common' class to each of them...the same instance is passed to all 3 child processes.
  2. This common class has a dictionary and a queue (which contains a lot of items).
  3. These child processes retrieve 'items' from this queue.
  4. For each of these items, I call a REST service to get some data about the 'item'.
  5. I add this info to the 'common' dictionary.
  6. There are no errors.

However, when I try to access this dictionary from the main process, it is empty.
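
To illustrate the behaviour being described, here is a minimal sketch (the Common class, the worker function, and the item count are hypothetical): the plain dict carried by the shared object is copied into each child process, so the children only ever update their own copies.

    import multiprocessing

    class Common:
        """Hypothetical container mirroring the question: a work queue plus a plain dict."""
        def __init__(self):
            self.queue = multiprocessing.Queue()
            self.results = {}   # a plain dict: each child process works on its own copy

    def worker(common):
        while True:
            item = common.queue.get()
            if item is None:                 # sentinel value: no more work
                break
            # ...a REST call for `item` would go here...
            common.results[item] = "data for item %d" % item   # only changes this child's copy

    if __name__ == "__main__":
        common = Common()
        for i in range(10):
            common.queue.put(i)
        for _ in range(3):                   # one sentinel per worker
            common.queue.put(None)

        procs = [multiprocessing.Process(target=worker, args=(common,)) for _ in range(3)]
        for p in procs:
            p.start()
        for p in procs:
            p.join()

        print(common.results)                # prints {} -- the parent's dict never changes

Running this prints an empty dict in the parent, which matches the symptom above.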

For sharing state between processes, the most flexible way is to use a multiprocessing.Manager. You can also use e.g. multiprocessing.Array, but that can only hold data of a single type.
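
A minimal sketch of the Manager-based approach, reusing the same hypothetical worker layout: the dict lives in the Manager's server process, the children write to it through proxies, and the parent sees the results after joining.

    import multiprocessing

    def worker(work_queue, shared_dict):
        while True:
            item = work_queue.get()
            if item is None:                 # sentinel value: no more work
                break
            # ...a REST call for `item` would go here...
            shared_dict[item] = "data for item %d" % item   # written through the proxy

    if __name__ == "__main__":
        manager = multiprocessing.Manager()
        shared_dict = manager.dict()         # lives in the manager process; children get proxies
        work_queue = multiprocessing.Queue()

        for i in range(10):
            work_queue.put(i)
        for _ in range(3):                   # one sentinel per worker
            work_queue.put(None)

        procs = [multiprocessing.Process(target=worker, args=(work_queue, shared_dict))
                 for _ in range(3)]
        for p in procs:
            p.start()
        for p in procs:
            p.join()

        print(dict(shared_dict))             # now contains all 10 entries

The queue can stay a plain multiprocessing.Queue; only the dict needs to go through the Manager for its updates to be visible across processes.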

Thanks for your responses.

I managed to solve the problem using multiprocessing.Manager().dict().

Is this the best way? I'm not entirely sure. I think I need to read a lot more on multiprocessing.

What I find challenging is that the multiprocessing module offers a lot of functionality, and as a beginner it's quite hard to know the right 'tools' to use for the job.

I started off with threads...then moved to processes...then used queues...then used managers. But as I read more on this subject, I see that Python has a lot more to offer.
