Using 'spawn' to start a redis process but facing TypeError: can't pickle _thread.lock objects
I have to use 'spawn' to start processes, because I need to pass CUDA tensors between processes. But when I use 'spawn' to create the Redis process, I always get TypeError: can't pickle _thread.lock objects.
For some reason, parts of this code were removed when posting.
It seems that only 'fork' works fine.
import asyncio  # added: used in run()
import redis
import torch.multiprocessing as mp  # added: the mp alias is used below
from torch.multiprocessing import Process

class Buffer(Process):
    def __init__(self, name=0, num_peers=2, actor_queue=0, communicate_queue=0):
        Process.__init__(self)
        # some arguments
        self.actor_queue = actor_queue
        self.communicate_queue = communicate_queue
        pool = redis.ConnectionPool(host='localhost', port=6379, decode_responses=True)
        self.r = redis.Redis(connection_pool=pool)
        self.r.flushall()

    async def write(self, r):
        ...  # do sth

    async def aggregate(self, r):
        ...  # do sth

    def run(self):
        name_process = mp.current_process().name + str(mp.current_process().pid)
        print('starting...', name_process)
        loop = asyncio.get_event_loop()
        asyncio.set_event_loop(loop)
        tasks = asyncio.gather(
            loop.create_task(self.write(self.r)),
            loop.create_task(self.aggregate(self.r)),
        )
        try:
            loop.run_until_complete(tasks)
        finally:
            loop.close()

if __name__ == '__main__':
    mp.set_start_method('spawn')
    queue = mp.Queue(maxsize=5)
    queue.put('sth')
    c_queue = mp.Queue(maxsize=5)  # assumed: its definition was lost from the original snippet
    name = 'yjsp'
    num_peers = 2
    p = Buffer(name, num_peers, queue, c_queue)
    p.start()
Problem solved!
We should define the connection pool (and anything else that holds locks) in run().
Here is the reason: with the 'spawn' start method, the parent process pickles the Buffer instance and sends it to the child process. The Redis connection pool holds thread locks internally (to guard its shared connections against concurrent access), and a raw _thread.lock object cannot be pickled, which is exactly the TypeError above. With 'fork', the child inherits the parent's memory directly, so nothing needs to be pickled and the code happens to work.
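The failure is easy to reproduce without Redis at all: pickling a bare thread lock, which is effectively what 'spawn' attempts when the pool lives on the instance, raises the same error. A minimal standard-library sketch:

```python
import pickle
import threading

# The Redis connection pool holds objects of this type internally.
lock = threading.Lock()

try:
    pickle.dumps(lock)
except TypeError as exc:
    # Same error as in the traceback above,
    # e.g. "cannot pickle '_thread.lock' object"
    print('pickling failed:', exc)
```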
If we define the pool in run(), we are already inside the child process by the time run() executes, so the pool is created there and never has to be pickled.
Like this:
def run(self):
    pool = redis.ConnectionPool(host='localhost', port=6379, decode_responses=True)
    r = redis.Redis(connection_pool=pool)
    r.flushall()
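To see the whole pattern end to end, here is a minimal sketch contrasting the two designs. It uses a plain threading.Lock as a stand-in for the lock-holding Redis connection pool so it runs without a Redis server, and the class names (EagerBuffer, LazyBuffer) are made up for illustration:

```python
import pickle
import threading

class EagerBuffer:
    """Failing pattern: the lock-holding resource is created in __init__,
    so it is part of the state that 'spawn' must pickle."""
    def __init__(self):
        self.lock = threading.Lock()  # stand-in for redis.ConnectionPool

class LazyBuffer:
    """Working pattern: __init__ stores only picklable state; the resource
    is created in run(), which executes inside the child process."""
    def __init__(self):
        self.lock = None  # nothing unpicklable stored yet

    def run(self):
        self.lock = threading.Lock()  # created after spawn, never pickled

pickle.dumps(LazyBuffer())        # fine: 'spawn' can ship this to the child
try:
    pickle.dumps(EagerBuffer())   # raises TypeError: cannot pickle a thread lock
except TypeError as exc:
    print('spawn would fail here:', exc)
```

The same split applies to the real Buffer(Process) subclass: keep __init__ limited to picklable arguments and move the redis.ConnectionPool and redis.Redis construction into run().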