Parallelly execute blocking calls in Python

I need to do a blocking XML-RPC call from my Python script to several physical servers simultaneously, and perform actions based on the response from each server independently. To explain in detail, let us assume the following pseudo code:

while True:
    response = call_to_server1()  # blocking and takes a very long time
    if response == this:
        do that

I want to do this for all the servers simultaneously and independently, but from the same script.

Use the threading module.

Boilerplate threading code (I can tailor this if you give me a little more detail on what you are trying to accomplish):

import threading

def run_me(func):
    while not stop_event.is_set():
        response = func()  # blocking and takes a very long time
        if response == this:  # placeholder condition
            do_that()         # placeholder action

def call_to_server1():
    # code to call server 1...
    return magic_server1_call()

def call_to_server2():
    # code to call server 2...
    return magic_server2_call()

# used to stop your loops
stop_event = threading.Event()

# note the trailing commas: args must be a tuple
t = threading.Thread(target=run_me, args=(call_to_server1,))
t.start()

t2 = threading.Thread(target=run_me, args=(call_to_server2,))
t2.start()

# wait for the threads to return
t.join()
t2.join()

# we are done...
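For completeness, here is a minimal sketch of what the magic server call might look like with the standard-library XML-RPC client; the server address and the get_status method name are assumptions, not part of the original answer:

import xmlrpclib

def magic_server1_call():
    # hypothetical server address and method name; adjust to your setup
    proxy = xmlrpclib.ServerProxy('http://192.168.0.1:8000/')
    return proxy.get_status()  # blocks until the XML-RPC server answers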

You can use the multiprocessing module:

import multiprocessing

def call_to_server(ip, port):
    # blocking call to one server goes here
    pass

processes = []
for i in xrange(server_count):
    processes.append(multiprocessing.Process(target=call_to_server, args=(ip, port)))
    processes[i].start()

# wait for the processes to finish
for p in processes:
    p.join()
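As written, the loop passes the same ip and port to every process; to target several physical servers, each process needs its own address. A minimal sketch, reusing call_to_server from above and assuming a hypothetical servers list of (ip, port) pairs:

servers = [('192.168.0.1', 8000), ('192.168.0.2', 8000)]  # hypothetical addresses

processes = []
for ip, port in servers:
    p = multiprocessing.Process(target=call_to_server, args=(ip, port))
    p.start()
    processes.append(p)

# wait for all the per-server workers to finish
for p in processes:
    p.join()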

You can use multiprocessing plus queues. With one single sub-process, this is the example:

import multiprocessing
import time

def processWorker(input, result):
    def remoteRequest(params):
        # this is my remote request
        return True
    while True:
        work = input.get()  # blocks until work arrives
        if work == 'STOP':  # sentinel value shuts the worker down
            break
        result.put(remoteRequest(work))

input  = multiprocessing.Queue()
result = multiprocessing.Queue()

p = multiprocessing.Process(target=processWorker, args=(input, result))
p.start()
requestlist = ['1', '2']
for req in requestlist:
    input.put(req)
for i in xrange(len(requestlist)):
    res = result.get(block=True)
    print 'retrieved ', res

input.put('STOP')
time.sleep(1)
print 'done'

To have more than one sub-process, simply use a list object to store all the sub-processes you start. The multiprocessing queue is safe to share between processes.

Then you can keep track of which request is being executed by each sub-process, simply by storing the request associated with a workid (the workid can be a counter, incremented whenever the queue is filled with new work). Usage of multiprocessing.Queue is robust since you do not need to rely on stdout/stderr parsing, and you also avoid the related limitations.
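A minimal sketch of that workid bookkeeping, reusing the input and result queues from above; it assumes processWorker is adapted to unpack (workid, params) tuples and to put (workid, result) back on the result queue:

workid = 0
pending = {}  # workid -> original request
for req in requestlist:
    input.put((workid, req))  # the worker echoes the workid back
    pending[workid] = req
    workid += 1

for i in xrange(len(requestlist)):
    wid, res = result.get(block=True)
    print 'request', pending[wid], 'returned', res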

Then, you can also set a timeout on how long you want a get call to wait at most, eg:

import Queue
try:
    res = result.get(block=True, timeout=10)
except Queue.Empty:
    print 'timed out waiting for a result'

Use twisted.

It has a lot of useful stuff for working with networks. It is also very good at working asynchronously.
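For example, a minimal sketch of a non-blocking XML-RPC call with Twisted's built-in client (twisted.web.xmlrpc.Proxy); the server URL and the get_status method name are assumptions:

from twisted.internet import reactor
from twisted.web.xmlrpc import Proxy

def on_response(response):
    print 'got', response

def on_error(failure):
    print 'call failed:', failure

proxy = Proxy('http://192.168.0.1:8000/')  # hypothetical server URL
d = proxy.callRemote('get_status')         # returns a Deferred immediately
d.addCallbacks(on_response, on_error)

reactor.callLater(10, reactor.stop)  # stop the event loop after 10 seconds
reactor.run()

Because callRemote never blocks, many such calls to different servers can be in flight at once inside a single reactor loop.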
