Python functions in a separate process. Can I unify the wrapper functions?
I am new to Python, and I would like to know how to implement the following more elegantly. I have functions f1, f2, ..., fN. These functions are wrappers: each spawns a new process (with target _f1, _f2, ..., _fN), passes its arguments (arg1, arg2, ...) to the child process, and receives the return value.
With code like this, I want the module's functions to execute in a different process from that of the caller (the user of the module).
The functions f1, f2, ..., fN (and, respectively, _f1, _f2, ..., _fN) may have different prototypes.
In a module:
def _f1(arg1, arg2, ... argn, connection):
    ...
    connection.send(return_value)
    connection.close()

def f1(arg1, arg2, ... argn):
    parent_conn, child_conn = Pipe()
    p = Process(target=_f1, args=(arg1, arg2, ... argn, child_conn))
    p.start()
    p.join()
    return parent_conn.recv()
def _f2(arg1, arg2, ... argm, connection):
    ...
    connection.send(return_value)
    connection.close()

def f2(arg1, arg2, ... argm):
    parent_conn, child_conn = Pipe()
    p = Process(target=_f2, args=(arg1, arg2, ... argm, child_conn))
    p.start()
    p.join()
    return parent_conn.recv()
...
def _fN(arg1, arg2, ... argk, connection):
    ...
    connection.send(return_value)
    connection.close()

def fN(arg1, arg2, ... argk):
    parent_conn, child_conn = Pipe()
    p = Process(target=_fN, args=(arg1, arg2, ... argk, child_conn))
    p.start()
    p.join()
    return parent_conn.recv()
Obviously, the wrapper functions f1, f2, ..., fN are largely identical. Can I implement them as a single wrapper function? I also want execution not to block: a user of the module should be able to execute, say, f1 and f2 concurrently.
I hope I have explained my problem clearly.
Here is a concrete example with two functions, sum() and sin():
def _sum(a, b, connection):
    return_value = a + b
    connection.send(return_value)
    connection.close()

def sum(a, b):
    parent_conn, child_conn = Pipe()
    p = Process(target=_sum, args=(a, b, child_conn))
    p.start()
    p.join()
    return parent_conn.recv()
def _sin(x, connection):
    return_value = sin(x)
    connection.send(return_value)
    connection.close()

def sin(x):
    parent_conn, child_conn = Pipe()
    p = Process(target=_sin, args=(x, child_conn))
    p.start()
    p.join()
    return parent_conn.recv()
Starting from the advanced idea of using decorators, I arrived at the solution posted below. I tried to take it further and also decorate the connection.send(return_value) and connection.close() calls, but that does not work for me. The code is below; I have marked with comments what works and what its (in my view) equivalent is that does not. Any help?
from multiprocessing import Process, Pipe

def process_wrapper1(func):
    def wrapper(*args):
        parent_conn, child_conn = Pipe()
        f_args = args + (child_conn,)
        p = Process(target=func, args=f_args)
        p.start()
        p.join()
        return parent_conn.recv()
    return wrapper

def process_wrapper2(func):
    def wrapper(*args):
        res = func(*args[:-1])
        args[-1].send(res)
        args[-1].close()
    return wrapper

#def _sum(a, b, connection):  # Working
#    return_value = a + b
#    connection.send(return_value)
#    connection.close()

def __sum(a, b):  # Doesn't work, see the error below
    return a + b

_sum = process_wrapper2(__sum)
sum = process_wrapper1(_sum)
The code above produces the following in the Pyzo IPython shell:
In [3]: import test1
In [4]: test1.sum(2,3)
---------------------------------------------------------------------------
PicklingError Traceback (most recent call last)
<ipython-input-4-8c542dc5e11a> in <module>()
----> 1 test1.sum(2,3)
C:\projects\PYnGUInLib\test1.py in wrapper(*args)
11 f_args = (child_conn,) + args
12 p = Process(target=func, args=f_args)
---> 13 p.start()
14 p.join()
15 return parent_conn.recv()
C:\pyzo2014a_64b\lib\multiprocessing\process.py in start(self)
103 'daemonic processes are not allowed to have children'
104 _cleanup()
--> 105 self._popen = self._Popen(self)
106 self._sentinel = self._popen.sentinel
107 _children.add(self)
C:\pyzo2014a_64b\lib\multiprocessing\context.py in _Popen(process_obj)
210 @staticmethod
211 def _Popen(process_obj):
--> 212 return _default_context.get_context().Process._Popen(process_obj)
213
214 class DefaultContext(BaseContext):
C:\pyzo2014a_64b\lib\multiprocessing\context.py in _Popen(process_obj)
311 def _Popen(process_obj):
312 from .popen_spawn_win32 import Popen
--> 313 return Popen(process_obj)
314
315 class SpawnContext(BaseContext):
C:\pyzo2014a_64b\lib\multiprocessing\popen_spawn_win32.py in __init__(self, process_obj)
64 try:
65 reduction.dump(prep_data, to_child)
---> 66 reduction.dump(process_obj, to_child)
67 finally:
68 context.set_spawning_popen(None)
C:\pyzo2014a_64b\lib\multiprocessing\reduction.py in dump(obj, file, protocol)
57 def dump(obj, file, protocol=None):
58 '''Replacement for pickle.dump() using ForkingPickler.'''
---> 59 ForkingPickler(file, protocol).dump(obj)
60
61 #
PicklingError: Can't pickle <function process_wrapper2.<locals>.wrapper at 0x0000000005541048>: attribute lookup wrapper on test1 failed
Traceback (most recent call last):
File "<string>", line 1, in <module>
File "C:\pyzo2014a_64b\lib\multiprocessing\spawn.py", line 106, in spawn_main
exitcode = _main(fd)
File "C:\pyzo2014a_64b\lib\multiprocessing\spawn.py", line 116, in _main
self = pickle.load(from_parent)
EOFError: Ran out of input
In [5]:
You can use a decorator to wrap your functions with the boilerplate of creating and executing the process:
def process_wrapper(func):
    def wrapper(*args):
        parent_conn, child_conn = Pipe()
        # attach the connection to the arguments
        f_args = args + (child_conn,)
        p = Process(target=func, args=f_args)
        p.start()
        p.join()
        return parent_conn.recv()
    return wrapper
and define your functions as:
@process_wrapper
def _f2(arg1, arg2, ... argm, connection):
    ...
    connection.send(return_value)
    connection.close()
Explanation: process_wrapper takes a function with N positional arguments, the last of which is always a pipe connection. It returns a function with N-1 arguments in which the connection is pre-filled.
For your concrete functions:
import math

@process_wrapper
def sin(x, connection):
    # call math.sin explicitly: after decoration, the module-level
    # name `sin` refers to the wrapper, not to the math function
    return_value = math.sin(x)
    connection.send(return_value)
    connection.close()

@process_wrapper
def sum(a, b, connection):
    return_value = a + b
    connection.send(return_value)
    connection.close()
You can then call the function simply as sum(a, b).
For more background on Python decorators, see http://www.jeffknupp.com/blog/2013/11/29/improve-your-python-decorators-explained/
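One caveat: under the Windows "spawn" start method this decorator can produce exactly the PicklingError shown in the question, because the nested wrapper closure cannot be pickled by name. A way around it (a sketch; the names _run_and_send and _add are illustrative) is to keep the child-side target at module level and bind the wrapped version under a different name than the original:

```python
from multiprocessing import Pipe, Process

def _run_and_send(func, conn, *args):
    # Module-level child target: picklable under the Windows "spawn"
    # start method, unlike a wrapper closure defined inside a decorator.
    conn.send(func(*args))
    conn.close()

def process_wrapper(func):
    def wrapper(*args):
        parent_conn, child_conn = Pipe()
        p = Process(target=_run_and_send, args=(func, child_conn) + args)
        p.start()
        result = parent_conn.recv()  # receive before join, so a full pipe buffer cannot deadlock
        p.join()
        return result
    return wrapper

def _add(a, b):  # plain module-level function, picklable by its qualified name
    return a + b

# Bind the wrapper under a *different* name, so that pickling `_add`
# by name still resolves to the original function.
add = process_wrapper(_add)
```

Calling add(2, 3) then runs _add in a child process and returns 5; on Windows the call itself must additionally sit under an if __name__ == '__main__': guard.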
You should use multiprocessing.Pool. Here is an example:
import multiprocessing

def f1(*args):
    rv = do_calculations()
    return rv

def f2(*args):
    ...

...

def fN(*args):
    ...

def worker(args):
    fn = args[0]
    return fn(*args[1:])

inputs = [
    [f1, f1_args],
    [f2, f2_args],
    ...
    [fN, fN_args],
]

pool = multiprocessing.Pool(processes=multiprocessing.cpu_count())
results = pool.map(worker, inputs)
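For the non-blocking requirement from the question, Pool.apply_async is a natural fit: it submits each call immediately and returns an AsyncResult, so calls to different functions run concurrently and .get() collects each return value afterwards. A minimal runnable sketch (my_sum and my_sin are illustrative stand-ins for the question's functions):

```python
import math
import multiprocessing

def my_sum(a, b):   # illustrative stand-in for f1
    return a + b

def my_sin(x):      # illustrative stand-in for f2
    return math.sin(x)

if __name__ == '__main__':
    with multiprocessing.Pool(processes=2) as pool:
        # apply_async returns immediately, so both calls run concurrently
        r1 = pool.apply_async(my_sum, (2, 3))
        r2 = pool.apply_async(my_sin, (0.0,))
        print(r1.get())  # 5
        print(r2.get())  # 0.0
```

Note that the worker functions must be defined at module level so the pool can pickle them, and on Windows the pool must be created under the if __name__ == '__main__': guard.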