
Unable to share values between processes in Multiprocessing python

I tried to run two scripts using multiprocessing as follows:

import multiprocessing
files=["file1","file2"]

if __name__ == "__main__":
    for name in files:
        process = multiprocessing.Process(target=lambda: __import__(name))
        process.start()

I'm not able to pass arguments when starting the process, because I get an error saying the lambda function takes 0 arguments but 2 were given.

I need to share a variable's data between the two scripts. Any suggestions on how to resolve this?

Note: the functions in the scripts would not run when they were imported and executed through multiprocessing, which is why I chose this approach.

You need to pass a function as the target, not the file.

# Importing the functions to run in parallel

from service.file import get_service
from service.file2 import get_another

from multiprocessing import Process


def run_in_parallel(functions: list):
    procs = []
    for fn in functions:
        p = Process(target=fn)  # pass extra arguments via args=(...,) if needed
        p.start()
        procs.append(p)
    for p in procs:
        p.join()  # wait for all processes to finish

if __name__ == '__main__':
    funcs = [get_service, get_another]
    run_in_parallel(funcs)

Here we build a list of processes, starting each one in turn and appending it to the list so the parent can wait on all of them.
