
use multiprocessing as local IPC

I'm considering using Python's multiprocessing package for messaging between local Python programs.

This seems like the right way to go IF:

  • The programs will always run locally on the same machine (and same OS instance)
  • The programs' implementation will remain in Python
  • Speed is important

Is it possible if the Python processes were started independently by the user, i.e. one did not spawn the other?

How?
The docs seem to give examples only of cases where one spawns the other.

The programs will always run locally on the same machine (and same OS instance)

Multiprocessing even supports remote concurrency (processes on different machines), so staying on one machine is the easy case.
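Although the docs mostly show parent/child setups, the `multiprocessing.connection` submodule also lets two independently started programs talk over a socket, by agreeing on an address and an auth key. A minimal sketch (the auth key is an arbitrary choice, and the "server" side runs in a thread here only so the example is self-contained; in real use it would be a separate program):

```python
from multiprocessing.connection import Listener, Client
import threading

AUTHKEY = b'shared-secret'   # both programs must agree on this key
ready = threading.Event()
bound = []                   # carries the OS-assigned address to the client

def server():
    # In real use this would be its own, independently started program.
    # Port 0 lets the OS pick a free port; listener.address reports it.
    with Listener(('127.0.0.1', 0), authkey=AUTHKEY) as listener:
        bound.append(listener.address)
        ready.set()
        with listener.accept() as conn:
            conn.send(conn.recv().upper())  # echo back, upper-cased

t = threading.Thread(target=server)
t.start()
ready.wait()

# The "client" program connects by address alone -- no parent/child link.
with Client(bound[0], authkey=AUTHKEY) as conn:
    conn.send('hello')
    reply = conn.recv()
t.join()
print(reply)  # -> HELLO
```

Each side only needs the address and key, so which process started first (or who started them) does not matter.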

The programs' implementation will remain in Python

Yes and no. You could wrap another command in a Python function. This will work, for example:

from multiprocessing import Process
import subprocess

def f(name):
    # the wrapped command need not be Python: here we shell out to `ls`
    # (the `name` argument is unused; it only shows how args are passed)
    subprocess.call(["ls", "-l"])

if __name__ == '__main__':
    p = Process(target=f, args=('bob',))
    p.start()
    p.join()

Speed is important

That depends on a number of factors:

  • how much overhead will the co-ordination between processes cause?
  • how many cores does your CPU have?
  • how much disk I/O does each process require? Do they work on the same physical disk?
  • ...

Is it possible if the Python processes were started independently by the user, i.e. one did not spawn the other?

I'm not an expert on the subject, but I once implemented something similar by using files to exchange data: one process's output file was monitored as an input source by the other, and vice versa.
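A minimal sketch of that file-based approach, assuming both programs agree on a well-known path (the path, JSON payload format, and polling interval are arbitrary choices here). Writing to a temporary file and then renaming avoids the reader ever seeing a half-written message, since `os.replace` is atomic on POSIX:

```python
import json
import os
import tempfile
import time

# agreed-upon "mailbox" location known to both programs (hypothetical path)
PATH = os.path.join(tempfile.gettempdir(), 'ipc_demo.json')

def write_message(payload):
    # write to a temp file first, then rename it into place atomically
    tmp = PATH + '.tmp'
    with open(tmp, 'w') as f:
        json.dump(payload, f)
    os.replace(tmp, PATH)

def read_message(timeout=5.0, poll=0.1):
    # poll until the file appears, then consume and delete it
    deadline = time.monotonic() + timeout
    while time.monotonic() < deadline:
        if os.path.exists(PATH):
            with open(PATH) as f:
                data = json.load(f)
            os.remove(PATH)
            return data
        time.sleep(poll)
    raise TimeoutError('no message arrived')

write_message({'cmd': 'ping'})
print(read_message())  # -> {'cmd': 'ping'}
```

This is simple and robust, but polling adds latency; for lower latency the socket-based `multiprocessing.connection` approach above is usually the better fit.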

HTH!
