
Python simultaneously filling up a list

So I am trying to write multiprocessing code that fills in a list, one element per worker process, but the list is never modified.

Now I know I can't increment the same element from multiple threads, because that would lead to race conditions. But in my code each index is accessed by one and only one process: for a list with 4 elements, I run 4 processes, one per element. This still doesn't work, even though I have read that lists are supposedly thread-safe.
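For comparison, the same one-index-per-worker pattern with `threading.Thread` (a sketch I added for contrast, not part of the original question) does update the list, because threads share the interpreter's memory:

```python
import threading

vals = [0, 0, 0, 0]

def incr_at(idx):
    # Each thread touches exactly one index, so no two threads race
    # on the same element.
    vals[idx] += 1

threads = [threading.Thread(target=incr_at, args=(i,)) for i in range(4)]
for t in threads:
    t.start()
for t in threads:
    t.join()

print(vals)  # [1, 1, 1, 1]
```

This is why the "lists are thread-safe" advice doesn't help here: the problem below is about *processes*, which do not share memory.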

I wrote a small program demonstrating my issue:

from multiprocessing import Process

list = [0,0,0,0]

def incrAt(idx):
    list[idx] += 1


p0 =  Process(target = incrAt, args=(0,))
p1 =  Process(target = incrAt, args=(1,))
p2 =  Process(target = incrAt, args=(2,))
p3 =  Process(target = incrAt, args=(3,))

p0.start()
p1.start()
p2.start()
p3.start()

# Do stuff while we wait...

p0.join()
p1.join()
p2.join()
p3.join()


print(list) # should print [1,1,1,1] but prints [0,0,0,0]

That's because global variables are not shared between processes.

Use a `multiprocessing.Manager().list()`:

from multiprocessing import Process, Manager

def incrAt(idx, lis):
    lis[idx] += 1

if __name__ == '__main__':
    with Manager() as manager:
        lis = manager.list([0, 0, 0, 0])
        procs = [Process(target=incrAt, args=(i, lis)) for i in range(4)]
        for p in procs:
            p.start()
        for p in procs:
            p.join()
        print(list(lis))  # [1, 1, 1, 1]

I renamed your list from `list` to `lis`, since `list` is a Python built-in.
