
Python functions in a separate process. Can I unify the wrapper functions?

I am new to Python and I am wondering how to implement the following in a syntactically more efficient way. I have functions f1, f2, ... fN. Those functions are wrappers which spawn new processes (with targets _f1, _f2, ... _fN), pass their arguments (arg1, arg2, ...) to the child processes, and receive the return values.

With code like this I want the module functionality to execute in a different process than the caller's (the user of the module).

The functions f1, f2, ... fN (respectively _f1, _f2, ... _fN) may have different prototypes.

In a module:

from multiprocessing import Process, Pipe

def _f1(arg1, arg2, ... argn,  connection):
    ...
    connection.send(return_value)
    connection.close()
def f1(arg1, arg2, ... argn):
    parent_conn, child_conn = Pipe()
    p = Process(target=_f1, args=(arg1, arg2, ... argn, child_conn))
    p.start()
    p.join() 
    return parent_conn.recv()


def _f2(arg1, arg2, ... argm,  connection):
    ...
    connection.send(return_value)
    connection.close()    
def f2(arg1, arg2, ... argm):
    parent_conn, child_conn = Pipe()
    p = Process(target=_f2, args=(arg1, arg2, ... argm, child_conn))
    p.start()
    p.join() 
    return parent_conn.recv()

...

def _fN(arg1, arg2, ... argk,  connection):
    ...
    connection.send(return_value)
    connection.close()    
def fN(arg1, arg2, ... argk):
    parent_conn, child_conn = Pipe()
    p = Process(target=_fN, args=(arg1, arg2, ... argk, child_conn))
    p.start()
    p.join() 
    return parent_conn.recv()

It is clear that the wrapper functions f1, f2, ... fN are almost identical. Can I implement them as a single wrapper function? I want the execution to be non-blocking: the user of the module should be able to execute, for example, f1 and f2 concurrently.

I hope I have managed to explain my question.

Here is a concrete example with two functions, sum() and sin():

import math
from multiprocessing import Process, Pipe

def _sum(a, b,  connection):
    return_value = a + b
    connection.send(return_value)
    connection.close()

def sum(a, b):    # note: shadows the built-in sum()
    parent_conn, child_conn = Pipe()
    p = Process(target=_sum, args=(a, b, child_conn))
    p.start()
    p.join()
    return parent_conn.recv()

def _sin(x,  connection):
    return_value = math.sin(x)    # math.sin: the bare name sin is shadowed by the wrapper below
    connection.send(return_value)
    connection.close()

def sin(x):
    parent_conn, child_conn = Pipe()
    p = Process(target=_sin, args=(x, child_conn))
    p.start()
    p.join()
    return parent_conn.recv()

Taking srj's idea about using decorators, I came to the solution posted below. I have tried to expand it even further, so that connection.send(return_value) and connection.close() are also handled by a decorator, but it doesn't work for me. The code is below; with comments I specify what is working and what (in my opinion an equivalent) is not working. Any help?

from multiprocessing import Process, Pipe

def process_wrapper1(func):
    def wrapper(*args):
        parent_conn, child_conn = Pipe()
        f_args = args + (child_conn,)
        p = Process(target=func, args=f_args)
        p.start()
        p.join() 
        return parent_conn.recv()
    return wrapper

def process_wrapper2(func):
    def wrapper(*args):
        # call func with everything except the trailing connection,
        # then send the result back through that connection
        res = func(*args[:-1])
        args[-1].send(res)
        args[-1].close()
    return wrapper



#def _sum(a, b,  connection):           # Working
#    return_value = a + b
#    connection.send(return_value)
#    connection.close()
def __sum(a, b):                        # Doesn't work, see the error below
    return a + b
_sum = process_wrapper2(__sum)

sum = process_wrapper1(_sum)

The above code, in the Pyzo IPython shell, generates the following result:

In [3]: import test1
In [4]: test1.sum(2,3)
---------------------------------------------------------------------------
PicklingError                             Traceback (most recent call last)
<ipython-input-4-8c542dc5e11a> in <module>()
----> 1 test1.sum(2,3)

C:\projects\PYnGUInLib\test1.py in wrapper(*args)
     11         f_args = (child_conn,) + args
     12         p = Process(target=func, args=f_args)
---> 13         p.start()
     14         p.join()
     15         return parent_conn.recv()

C:\pyzo2014a_64b\lib\multiprocessing\process.py in start(self)
    103                'daemonic processes are not allowed to have children'
    104         _cleanup()
--> 105         self._popen = self._Popen(self)
    106         self._sentinel = self._popen.sentinel
    107         _children.add(self)

C:\pyzo2014a_64b\lib\multiprocessing\context.py in _Popen(process_obj)
    210     @staticmethod
    211     def _Popen(process_obj):
--> 212         return _default_context.get_context().Process._Popen(process_obj)
    213 
    214 class DefaultContext(BaseContext):

C:\pyzo2014a_64b\lib\multiprocessing\context.py in _Popen(process_obj)
    311         def _Popen(process_obj):
    312             from .popen_spawn_win32 import Popen
--> 313             return Popen(process_obj)
    314 
    315     class SpawnContext(BaseContext):

C:\pyzo2014a_64b\lib\multiprocessing\popen_spawn_win32.py in __init__(self, process_obj)
     64             try:
     65                 reduction.dump(prep_data, to_child)
---> 66                 reduction.dump(process_obj, to_child)
     67             finally:
     68                 context.set_spawning_popen(None)

C:\pyzo2014a_64b\lib\multiprocessing\reduction.py in dump(obj, file, protocol)
     57 def dump(obj, file, protocol=None):
     58     '''Replacement for pickle.dump() using ForkingPickler.'''
---> 59     ForkingPickler(file, protocol).dump(obj)
     60 
     61 #

PicklingError: Can't pickle <function process_wrapper2.<locals>.wrapper at 0x0000000005541048>: attribute lookup wrapper on test1 failed
Traceback (most recent call last):
  File "<string>", line 1, in <module>
  File "C:\pyzo2014a_64b\lib\multiprocessing\spawn.py", line 106, in spawn_main
   exitcode = _main(fd)
  File "C:\pyzo2014a_64b\lib\multiprocessing\spawn.py", line 116, in _main
   self = pickle.load(from_parent)
EOFError: Ran out of input

In [5]: 
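For context on the error above: on Windows, multiprocessing uses the spawn start method, which pickles the Process target by module and qualified name. The closure returned by process_wrapper2 cannot be looked up as an attribute of test1, hence the PicklingError. Below is a minimal sketch of a rework that keeps the child-side target at module level and replaces both process_wrapper1 and process_wrapper2; the helper name _run_in_child is illustrative, not from the original code.

from multiprocessing import Process, Pipe

def _run_in_child(func, args, connection):
    # module-level helper: picklable by name, unlike a decorator closure
    connection.send(func(*args))
    connection.close()

def process_wrapper(func):
    def wrapper(*args):
        parent_conn, child_conn = Pipe()
        # pass the plain module-level function; it pickles by reference
        p = Process(target=_run_in_child, args=(func, args, child_conn))
        p.start()
        result = parent_conn.recv()   # receive before join to avoid a full-pipe deadlock
        p.join()
        return result
    return wrapper

def __sum(a, b):
    return a + b

sum = process_wrapper(__sum)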

You could use a decorator to wrap the function with the boilerplate of creating the process and executing it.

from multiprocessing import Process, Pipe

def process_wrapper(func):
    def wrapper(*args):
        parent_conn, child_conn = Pipe()
        #attach the connection to the arguments
        f_args = args + (child_conn,)
        p = Process(target=func, args=f_args)
        p.start()
        p.join() 
        return parent_conn.recv()
    return wrapper

and define the function as:

@process_wrapper
def _f2(arg1, arg2, ... argm,  connection):
    ...
    connection.send(return_value)
    connection.close()

Explanation: the process_wrapper function takes a function that has N positional arguments, the last of which is always a pipe connection. It returns a function with N - 1 arguments, with the connection pre-filled.

In the case of your concrete functions:

import math

@process_wrapper
def sin(x,  connection):
    return_value = math.sin(x)   # math.sin: the decorated name sin would call the wrapper recursively
    connection.send(return_value)
    connection.close()

@process_wrapper
def sum(a, b,  connection):
    return_value = a + b
    connection.send(return_value)
    connection.close()

You could call the functions as:

sum(a,b)
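
Since the question also asks for non-blocking execution, here is a minimal sketch of an asynchronous variant of the same decorator; it is my own extension, not part of the answer above. The call starts the child process and returns immediately, and the caller collects the result later, so e.g. sum and sin can run concurrently.

def process_wrapper_async(func):
    def wrapper(*args):
        parent_conn, child_conn = Pipe()
        p = Process(target=func, args=args + (child_conn,))
        p.start()                        # return without joining: the call does not block
        def result():
            value = parent_conn.recv()   # blocks only when the result is actually needed
            p.join()
            return value
        return result
    return wrapper

# usage: both children run concurrently
# pending_sum = sum(2, 3)
# pending_sin = sin(1.0)
# print(pending_sum(), pending_sin())

Note that on Windows (spawn) this variant pickles the undecorated func as the Process target and runs into the same PicklingError discussed above; on Unix (fork) it works as-is.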

More references on Python decorators: http://www.jeffknupp.com/blog/2013/11/29/improve-your-python-decorators-explained/

You should use multiprocessing.Pool. Here is an example:

import multiprocessing

def f1(*args):
    rv = do_calculations()
    return rv

def f2(*args):
    ...

...
def fN(*args):
    ...

def worker(args):
    fn = args[0]
    return fn(*args[1:])

inputs = [
    [f1, f1_args],
    [f2, f2_args],
    ...
    [fN, fN_args]
]

if __name__ == '__main__':    # required on Windows, where child processes re-import the module
    pool = multiprocessing.Pool(processes=multiprocessing.cpu_count())
    results = pool.map(worker, inputs)
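
pool.map blocks until every result is ready. If the calls should not block, apply_async submits work and returns immediately; the results are collected later from the returned AsyncResult objects. A short sketch reusing the placeholder names from above:

if __name__ == '__main__':
    pool = multiprocessing.Pool()
    r1 = pool.apply_async(f1, f1_args)   # returns at once; f1 runs in a pool worker
    r2 = pool.apply_async(f2, f2_args)
    print(r1.get(), r2.get())            # .get() blocks only until each result arrives

Note that functions submitted to a Pool must be defined at module level so they can be pickled, as discussed above.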
