
How to update a dictionary in multiprocessing Python

Thank you in advance for your help.

I am trying to update a dictionary in Python code that is parallelized with multiprocessing. Unfortunately, I cannot share my code because it is quite complicated, but I will try to explain my problem.

First of all, I don't have a shared dictionary; for some reason, the dictionary has to be created in every process separately. The problem arises when I add an element to this dictionary: every time I add a new key: value pair, the values of all the previous keys become equal to the new value. Something like this...

def func():
    my_dict = {}
    my_dict['first'] = 1
    my_dict['second'] = 2
    print(my_dict) 

import multiprocessing as mp

pool = mp.Pool(2)
pool.map(func)


The results will be something like this:

my_dict = {'first':2, 'second':2}

I also tried Manager().dict(), but I encountered several problems, so I decided to ask here first.

Thanks again for your time.

my_dict is not shared or synchronized between processes in your example, so it should be okay. I ran:

def func(_):
    # a fresh, local dictionary is created inside each worker process
    my_dict = {}
    my_dict['first'] = 1
    my_dict['second'] = 2
    print(my_dict)

import multiprocessing as mp

pool = mp.Pool(2)
pool.map(func, range(2))

With Python 3.8.10 on Ubuntu 18.04.5, it outputs:

{'first': 1, 'second': 2}
{'first': 1, 'second': 2}
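
If the dictionary actually does need to be shared and updated across processes, a manager-backed dictionary is one option. This is only a minimal sketch of that approach, not your original code; the function, keys, and values here are made up for illustration:

def fill(shared_dict, key, value):
    # writes go through the manager proxy, so they are visible to all processes
    shared_dict[key] = value

import multiprocessing as mp

if __name__ == '__main__':
    with mp.Manager() as manager:
        shared_dict = manager.dict()
        with mp.Pool(2) as pool:
            pool.starmap(fill, [(shared_dict, 'first', 1), (shared_dict, 'second', 2)])
        print(dict(shared_dict))  # e.g. {'first': 1, 'second': 2}

The manager proxy can be passed to pool workers because it is picklable; copying it into a plain dict at the end avoids holding the proxy after the manager shuts down.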
