I have a shared multiprocessing.Pool object in my application which is initialized with two queue objects (one for jobs and the other for results). How can I send an arbitrary queue object into the job queue and have the worker process send its result to this alternative queue?

job_q.put_nowait((item, alt_q))  # Raises an exception.

This approach works fine with multithreading but not with multiprocessing.
The example code below demonstrates what I am trying to achieve. I initialize the pool with two multiprocessing.Queue objects, job_q and res_q (well, in fact, they are proxies created by multiprocessing.Manager). The run function is the run-loop of each process: it monitors the job queue for items and simply adds them to the result queue. (A separate thread monitors each result queue and prints incoming items to stdout.)
import multiprocessing as mp
import queue
import threading
import time
import os


def run(job_queue, result_queue):
    """ Run-loop for each process.
    """
    print("Starting process {}".format(os.getpid()))
    while True:
        job_q = job_queue
        res_q = result_queue
        try:
            # `item` is just a string
            # `opt_queue` is an optional result queue to use
            item, opt_queue = job_q.get(True, 0.05)
            if opt_queue is not None:
                res_q = opt_queue
            item = item + " Processed"
            res_q.put_nowait(item)
        except queue.Empty:
            continue


def monitor_queue(mp_queue):
    """ The target of a monitoring thread.
    """
    while True:
        try:
            item = mp_queue.get(True, 0.05)
            print("Got `{}`".format(item))
        except queue.Empty:
            continue


if __name__ == '__main__':
    m = mp.Manager()
    job_q = m.Queue()
    res_q = m.Queue()
    alt_q = m.Queue()

    # Monitor `res_q` for items
    threading.Thread(target=monitor_queue, args=(res_q,)).start()
    # Monitor `alt_q` for items
    threading.Thread(target=monitor_queue, args=(alt_q,)).start()

    # `run` is called by each process; share `job_q` and `res_q` with all processes
    pool = mp.Pool(2, run, (job_q, res_q))
    time.sleep(1)

    # Add an item to `job_q`; `None` means send the result to `res_q`
    print('Putting first item into the job queue')
    job_q.put_nowait(('#1', None))  # prints... Got `#1 Processed`
    time.sleep(1)

    # Add an item to `job_q` and send the result to `alt_q`
    print('Putting second item into the job queue and passing alternative result queue')
    job_q.put_nowait(('#2', alt_q))  # TypeError: AutoProxy() got an unexpected keyword argument 'manager_owned'

    pool.close()
    pool.terminate()
This exits with the following error:
Putting second item into the job queue and passing alternative result queue
Traceback (most recent call last):
File "/Users/daniel/Desktop/pydebug/mp_example.py", line 54, in <module>
job_q.put_nowait(('#1', alt_q)) # TypeError: AutoProxy() got an unexpected keyword argument 'manager_owned'
File "<string>", line 2, in put_nowait
File "/usr/local/Cellar/python3/3.6.1/Frameworks/Python.framework/Versions/3.6/lib/python3.6/multiprocessing/managers.py", line 772, in _callmethod
raise convert_to_error(kind, result)
multiprocessing.managers.RemoteError:
---------------------------------------------------------------------------
Traceback (most recent call last):
File "/usr/local/Cellar/python3/3.6.1/Frameworks/Python.framework/Versions/3.6/lib/python3.6/multiprocessing/managers.py", line 228, in serve_client
request = recv()
File "/usr/local/Cellar/python3/3.6.1/Frameworks/Python.framework/Versions/3.6/lib/python3.6/multiprocessing/connection.py", line 251, in recv
return _ForkingPickler.loads(buf.getbuffer())
File "/usr/local/Cellar/python3/3.6.1/Frameworks/Python.framework/Versions/3.6/lib/python3.6/multiprocessing/managers.py", line 881, in RebuildProxy
return func(token, serializer, incref=incref, **kwds)
TypeError: AutoProxy() got an unexpected keyword argument 'manager_owned'
---------------------------------------------------------------------------
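Judging from the traceback, the TypeError arises when the proxy is unpickled inside the manager's server process: RebuildProxy forwards a manager_owned keyword that this interpreter's AutoProxy does not accept. Newer CPython releases accept that keyword, so a commonly circulated workaround (a sketch relying on multiprocessing internals, not a public API) is to wrap AutoProxy so the extra keyword is dropped on interpreters that still reject it:

```python
import inspect
from multiprocessing import managers

# Patch only if this interpreter's AutoProxy does not yet accept
# `manager_owned`; newer CPython versions need no patch.
if "manager_owned" not in inspect.signature(managers.AutoProxy).parameters:
    _orig_AutoProxy = managers.AutoProxy

    def _patched_AutoProxy(token, serializer, manager=None, authkey=None,
                           exposed=None, incref=True, manager_owned=False):
        # Swallow the `manager_owned` keyword that RebuildProxy passes
        # but the original AutoProxy does not understand.
        return _orig_AutoProxy(token, serializer, manager=manager,
                               authkey=authkey, exposed=exposed, incref=incref)

    managers.AutoProxy = _patched_AutoProxy
```

Since this reaches into multiprocessing internals, the signature check above keeps it a no-op on interpreters where the keyword is already supported.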
I don't think you can pass a queue inside the message, because it is not serializable.