
Multiprocessing in Python creates single process

I have an issue with multiprocessing in Python; refer to my code below:

processArray = []
downtimeObj = []
for x in range(0,len(clientMaster)):
    downtimeObj.append(Downtime(clientMaster[x]))
    processArray.append(multiprocessing.Process(target=downtimeObj[x].live(),))
    for j in range(len(processArray)):
        processArray[j].start()
    for z in range(len(processArray)):
        processArray[z].join()

Here I have an array of processes, processArray, and an array of objects of class Downtime. I'm trying to iterate through my clientMaster, create an object for each client, and thereby create a process for each client.

The current code creates a single process at a time and runs the live() function sequentially.

I want each client to have a separate process so that the object's live() function runs for all of them simultaneously.

You should change the indentation: first run a loop that only creates all the processes, then a loop that calls .start() on each, and finally a loop that calls .join().

Also, Process() (similar to Thread()) needs the function's name without (), so that it can later use () to start it in the new process:

target=downtimeObj[x].live

If you use it with (), then it runs live() in the current process and sends its result as the argument to Process(), like:

result = downtimeObj[x].live()
Process(target=result)

You could also learn to use a for loop without range(len()) - it will be more readable.

processArray = []
downtimeObj = []

# --- loop ---

for item in clientMaster:
    obj = Downtime(item)
    downtimeObj.append(obj)
    p = multiprocessing.Process(target=obj.live)
    processArray.append(p)

# --- after loop ---

for p in processArray:
    p.start()

for p in processArray:
    p.join()

Or you should at least run .join() outside the first loop:

processArray = []
downtimeObj = []

# --- loop ---

for item in clientMaster:
    obj = Downtime(item)
    downtimeObj.append(obj)
    p = multiprocessing.Process(target=obj.live)
    processArray.append(p)
    p.start()

# --- after loop ---

for p in processArray:
    p.join()

EDIT:

If you need to send arguments, then use a tuple, i.e. (value1, value2):

Process(target=obj.live, args=(value1, value2))

and it will run live(value1, value2).

For a single argument you also need a tuple, i.e. (value1,).
It needs a , inside the ( ) to create a tuple with a single element.

Process(target=obj.live, args=(value1,))

and it will run live(value1).

You can also see this in the documentation for Process.


EDIT:

If you want to use Pool to run only 5 processes at a time and collect all the results:

from multiprocessing import Pool

# --- function for Process ---

def my_function(item, arg1, arg2):
    obj = Downtime(item)
    return obj.live(arg1, arg2)

# --- create arguments for all processes ---

arguments = []

for item in clientMaster:
    arguments.append( (item, value1, value2) )

# --- use Pool and wait for all results ---

with Pool(5) as p:
    results = p.starmap(my_function, arguments)

# --- display all results ---

print(results)
