
Sharing data structures between 2 Python processes

I have 2 applications/processes/scripts running with Python 3: is there some easy RPC mechanism to share lists, tuples, and other data structures between these 2 separate processes?

To be precise, these 2 processes run locally on the same machine, but a generic solution that also works with remote processes would be highly appreciated.

If you start the processes using multiprocessing, you can share Array and Value variables across the process boundary.

Check out the Python documentation page on shared memory and multiprocessing:

...

    from multiprocessing import Process, Value, Array

    def f(n, a):
        n.value = 3.1415927
        for i in range(len(a)):
            a[i] = -a[i]

    if __name__ == '__main__':
        num = Value('d', 0.0)
        arr = Array('i', range(10))

        p = Process(target=f, args=(num, arr))
        p.start()
        p.join()

        print(num.value)
        print(arr[:])

...

Here the typecode ('d' and 'i' in the example) can be one of:

'c': ctypes.c_char,  'u': ctypes.c_wchar,
'b': ctypes.c_byte,  'B': ctypes.c_ubyte,
'h': ctypes.c_short, 'H': ctypes.c_ushort,
'i': ctypes.c_int,   'I': ctypes.c_uint,
'l': ctypes.c_long,  'L': ctypes.c_ulong,
'f': ctypes.c_float, 'd': ctypes.c_double

and the initial value must match the chosen typecode (a float for 'd', an integer for 'i', and so on).
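
As a minimal sketch (the variable names here are illustrative, not from the original answer), the typecode determines both the C type stored and what initial value is valid:

```python
from multiprocessing import Value, Array

# The initial value must match the typecode:
# 'i' holds a C int, 'd' holds a C double, 'b' holds signed bytes.
count = Value('i', 0)     # integer counter
ratio = Value('d', 0.0)   # double-precision float
buf = Array('b', 8)       # eight signed bytes, zero-initialised

count.value += 1
ratio.value = 2.5
print(count.value, ratio.value, buf[:])
```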

Note that f's parameters are mandatory, so the shared num and arr objects must be passed to the child process through args=.
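
Value and Array only cover flat ctypes data. For sharing richer structures such as lists and dicts, multiprocessing.Manager hands out proxy objects that can be passed across the process boundary; the sketch below (function and variable names are illustrative) shows the idea:

```python
from multiprocessing import Process, Manager

def worker(shared_list, shared_dict):
    # Mutations made through the proxies are visible in the parent.
    shared_list.append(42)
    shared_dict['status'] = 'done'

if __name__ == '__main__':
    with Manager() as manager:
        shared_list = manager.list([1, 2, 3])
        shared_dict = manager.dict()

        p = Process(target=worker, args=(shared_list, shared_dict))
        p.start()
        p.join()

        print(list(shared_list))   # [1, 2, 3, 42]
        print(dict(shared_dict))   # {'status': 'done'}
```

For the remote case mentioned in the question, multiprocessing.managers.BaseManager accepts an address (host, port) and an authkey, so the same proxy mechanism also works between processes on different machines.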

