Asyncio and multiprocessing.Process - how to pass a coroutine?
I've got a coroutine which is scraping some data. I'm trying to add multiprocessing to speed it up, and I've come up with using multiprocessing.Process. My coroutine is called fetch_students_pages.
What I'm trying to do is to pass this coroutine as target to the Process class:
def fetch_students_concurrently(self):
    for _ in range(10):
        Process(target=self.fetch_students_pages).start()
but it fails with:
/usr/lib/python3.7/multiprocessing/process.py:99: RuntimeWarning: coroutine 'StudentPageParser.fetch_students_pages' was never awaited
  self._target(*self._args, **self._kwargs)
RuntimeWarning: Enable tracemalloc to get the object allocation traceback
Is there a way to await it, or should I use another solution instead?
Consider using map() instead:
from multiprocessing import Pool

cores = 8
with Pool(cores) as p:
    p.map(self.fetch_students_pages, range(cores))
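Note that a Process (or Pool) target must be a plain function: calling a coroutine function only creates a coroutine object, which is never awaited unless an event loop runs it. A common fix is to wrap the coroutine in a synchronous function that calls asyncio.run(), and hand that wrapper to Process. A minimal sketch, where fetch_students_pages below is a hypothetical stand-in for the real scraping coroutine:

```python
import asyncio
from multiprocessing import Process


async def fetch_students_pages(worker_id):
    # Placeholder for the actual scraping coroutine.
    await asyncio.sleep(0.01)
    return worker_id


def run_in_process(worker_id):
    # Each child process creates its own event loop via asyncio.run(),
    # so the coroutine is actually awaited inside that process.
    return asyncio.run(fetch_students_pages(worker_id))


if __name__ == "__main__":
    procs = [Process(target=run_in_process, args=(i,)) for i in range(4)]
    for p in procs:
        p.start()
    for p in procs:
        p.join()
```

The same never-awaited warning would appear if a raw coroutine function were passed to Pool.map, so a synchronous wrapper like run_in_process is needed there as well.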