
Non-blocking multiprocessing.connection.Listener?

I use multiprocessing.connection.Listener for communication between processes, and it works like a charm for me. Now I would really love my mainloop to do something else between commands from the client. Unfortunately listener.accept() blocks execution until a connection from the client process is established.

Is there a simple way to do a non-blocking check on a multiprocessing.connection? A timeout? Or should I use a dedicated thread?

    # Simplified code:

    from multiprocessing.connection import Listener

    def mainloop():
        listener = Listener(address=('localhost', 6000), authkey=b'secret')

        while True:
            conn = listener.accept() # <---  This blocks!
            msg = conn.recv() 
            print('got message: %r' % msg)
            conn.close()

One solution that I found (although it might not be the most "elegant") is to use conn.poll (documentation). poll returns True if the Listener has new data, and (most importantly) is non-blocking if no argument is passed to it. I'm not 100% sure that this is the best way to do it, but I've had success with running listener.accept() only once, and then using the following syntax to repeatedly get input (if any is available):

from multiprocessing.connection import Listener

def mainloop():
    running = True

    listener = Listener(address=('localhost', 6000), authkey=b'secret')
    conn = listener.accept()
    msg = ""

    while running:
        while conn.poll():
            msg = conn.recv() 
            print (f"got message: {msg}")

            if msg == "EXIT":
                running = False

        # Other code can go here
        print(f"I can run too! Last msg received was {msg}")

    conn.close()

The while in the conditional statement can be replaced with if if you only want to get at most one message at a time. Use this with caution: it seems somewhat 'hacky', and I haven't found references to using conn.poll for this purpose elsewhere.
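
Since the question also mentions a timeout: poll() accepts an optional timeout in seconds, so you can wait a bounded time for data instead of spinning. A minimal sketch, reusing the conn from the example above:

if conn.poll(0.5):                # wait at most 0.5 s for data
    msg = conn.recv()             # data is ready, so recv() returns immediately
    print(f"got message: {msg}")
else:
    pass                          # timed out; do other work here and poll again later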

I've not used the Listener object myself. For this task I normally use multiprocessing.Queue; docs at the following link:

https://docs.python.org/2/library/queue.html#Queue.Queue

That object can be used to send and receive any pickle-able object between Python processes with a nice API; I think you'll be most interested in:

  • in process A
    • .put('some message')
  • in process B
    • .get_nowait() # will raise queue.Empty if nothing is available; handle that to move on with your execution

The only limitation is that you'll need control of both Process objects at some point in order to hand the queue to them; something like this:

import time
from queue import Empty
from multiprocessing import Queue, Process


def receiver(q):
    while True:
        try:
            message = q.get_nowait()
            print('receiver got', message)
        except Empty:
            print('nothing to receive, sleeping')
            time.sleep(1)


def sender(q):
    while True:
        message = 'some message'
        q.put(message)
        print('sender sent', message)
        time.sleep(1)


if __name__ == '__main__':
    some_queue = Queue()

    process_a = Process(
        target=receiver,
        args=(some_queue,)
    )

    process_b = Process(
        target=sender,
        args=(some_queue,)
    )

    process_a.start()
    process_b.start()

    print('ctrl + c to exit')
    try:
        while True:
            time.sleep(1)
    except KeyboardInterrupt:
        pass

    process_a.terminate()
    process_b.terminate()

    process_a.join()
    process_b.join()

Queues are nice because you can have as many consumers and as many producers for that exact same Queue object as you like (handy for distributing tasks).
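
To illustrate that point, here is a small sketch (the worker count and task payloads are made up): several worker processes share one queue, and each task is delivered to exactly one of them. get(timeout=...) lets each worker give up once the queue stays empty:

from queue import Empty
from multiprocessing import Process, Queue


def worker(q, name):
    while True:
        try:
            task = q.get(timeout=1)   # give up after 1 s with no work left
        except Empty:
            break
        print(name, 'processing task', task)


if __name__ == '__main__':
    q = Queue()
    for task in range(10):            # enqueue all tasks up front
        q.put(task)
    workers = [Process(target=worker, args=(q, f'worker-{i}')) for i in range(3)]
    for w in workers:
        w.start()
    for w in workers:
        w.join()                      # each task is consumed by exactly one worker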

I should point out that just calling .terminate() on a Process is bad form: you should use your shiny new messaging system to pass a shutdown message or something of that nature.
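
For example, a sketch of that idea applied to the example above (the 'STOP' sentinel is just an illustrative choice): the sender queues a sentinel when it is done, the receiver exits its loop when it sees it, and both processes can then be join()ed without terminate():

import time
from queue import Empty
from multiprocessing import Process, Queue

STOP = 'STOP'  # sentinel value; anything both sides agree on works


def receiver(q):
    while True:
        try:
            message = q.get_nowait()
        except Empty:
            time.sleep(0.1)   # nothing yet; sleep briefly and retry
            continue
        if message == STOP:   # shutdown request from the sender
            break
        print('receiver got', message)


def sender(q):
    for i in range(5):
        q.put(f'message {i}')
        time.sleep(0.5)
    q.put(STOP)               # ask the receiver to finish instead of terminating it


if __name__ == '__main__':
    some_queue = Queue()
    process_a = Process(target=receiver, args=(some_queue,))
    process_b = Process(target=sender, args=(some_queue,))
    process_a.start()
    process_b.start()
    process_b.join()          # sender exits after queueing STOP
    process_a.join()          # receiver exits cleanly on the sentinel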

The multiprocessing module comes with a nice feature called Pipe(). It is a nice way to share resources between two processes (I've never tried more than two). With the dawn of Python 3.8 came the shared memory functions in the multiprocessing module, but I have not really tested those, so I cannot vouch for them. You will use the pipe function something like this:

from multiprocessing import Pipe, Process

# ...

def sending(conn):
    message = 'some message'
    # perform some code
    conn.send(message)
    conn.close()

if __name__ == '__main__':
    parent_conn, child_conn = Pipe()
    p = Process(target=sending, args=(child_conn,))
    p.start()
    print(parent_conn.recv())   # prints "some message"
    p.join()

With this you should be able to have separate processes running independently, up to the point where you need the input from one process. If there is an error because the other process's data has not arrived yet, you can put your process to sleep, or use a while loop to keep checking whether the other process has finished its task and sent the data over:

while not parent_conn.poll():   # poll() checks for pending data without blocking
    time.sleep(5)
result = parent_conn.recv()     # data is pending, so recv() returns immediately

This should keep it waiting until the other process finishes running and sends the result. This is also about 2-3 times faster than Queue. Although Queue is also a good option, personally I do not use it.
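
If you want to check that speed claim on your own machine, a rough micro-benchmark sketch along these lines will do (the message count is arbitrary, and results vary by platform and payload size):

import time
from multiprocessing import Pipe, Process, Queue

N = 100_000  # arbitrary number of messages for the benchmark


def pipe_sender(conn):
    for i in range(N):
        conn.send(i)
    conn.close()


def queue_sender(q):
    for i in range(N):
        q.put(i)


if __name__ == '__main__':
    parent_conn, child_conn = Pipe()
    p = Process(target=pipe_sender, args=(child_conn,))
    start = time.perf_counter()
    p.start()
    for _ in range(N):
        parent_conn.recv()
    p.join()
    print('Pipe: ', time.perf_counter() - start)

    q = Queue()
    p = Process(target=queue_sender, args=(q,))
    start = time.perf_counter()
    p.start()
    for _ in range(N):
        q.get()
    p.join()
    print('Queue:', time.perf_counter() - start)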

You can run the blocking function in a thread:

conn = await loop.run_in_executor(None, listener.accept)
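
For context, a fuller sketch of that approach, assuming an asyncio mainloop (run_in_executor with None uses the default ThreadPoolExecutor, so the blocking accept() and recv() calls tie up a worker thread instead of the event loop):

import asyncio
from multiprocessing.connection import Listener


async def mainloop():
    loop = asyncio.get_running_loop()
    listener = Listener(address=('localhost', 6000), authkey=b'secret')
    while True:
        # accept() and recv() block, so hand them to the default executor;
        # the event loop stays free to run other coroutines meanwhile
        conn = await loop.run_in_executor(None, listener.accept)
        msg = await loop.run_in_executor(None, conn.recv)
        print(f'got message: {msg}')
        conn.close()


asyncio.run(mainloop())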
