How to combine multiprocessing with asyncio
I have URL-checker code that checks whether a URL responds with status 200, and I want to make it asynchronous together with multiprocessing, but I'm stuck. Please help me build the code.
import aiohttp
import asyncio
from aiomultiprocess import Pool

# ================================================================================
async def fetch(session, url):
    try:
        # Use the session passed in by the caller; opening a new
        # ClientSession here would shadow the parameter and waste connections.
        async with session.get(url) as response:
            if response.status != 200:
                response.raise_for_status()
            return {'url': url, 'action': 'success', 'status_code': response.status}
    except aiohttp.ClientError:
        return {'url': url, 'action': 'error', 'status_code': 404}
async def fetch_all(session, urls):
    tasks = []
    for url in urls:
        task = asyncio.create_task(fetch(session, url))
        tasks.append(task)
    results = await asyncio.gather(*tasks)
    return results
Calling the function:
async with aiohttp.ClientSession() as session:
    dataset = await fetch_all(session, urls)
The easiest way would be to use Pool.starmap():

async def fetch_all(session, urls):
    async with Pool() as pool:
        starargs = [(session, url) for url in urls]
        results = await pool.starmap(fetch, starargs)
    return results
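One caveat worth knowing: an aiohttp.ClientSession generally cannot be pickled, so it cannot be shipped to worker processes as a starmap argument; each process needs to create its own session (and its own event loop). Below is a minimal, hedged sketch of that pattern using only the standard library (concurrent.futures instead of aiomultiprocess), with a hypothetical check_url placeholder standing in for the real aiohttp request so the structure is runnable without network access:

    import asyncio
    from concurrent.futures import ProcessPoolExecutor

    async def check_url(url):
        # Placeholder for the real check; in practice, open an
        # aiohttp.ClientSession *inside the worker process* and GET the URL.
        await asyncio.sleep(0)  # stand-in for network I/O
        return {'url': url, 'action': 'success', 'status_code': 200}

    async def check_many(urls):
        # Run all checks in one chunk concurrently on this process's loop.
        return await asyncio.gather(*(check_url(u) for u in urls))

    def worker(urls):
        # Each process runs its own event loop; nothing unpicklable
        # (sessions, loops) ever crosses the process boundary.
        return asyncio.run(check_many(urls))

    def chunked(seq, n):
        # Split the URL list into n roughly equal chunks.
        return [seq[i::n] for i in range(n)]

    def main(urls, processes=2):
        with ProcessPoolExecutor(max_workers=processes) as pool:
            results = []
            for part in pool.map(worker, chunked(urls, processes)):
                results.extend(part)
        return results

    if __name__ == '__main__':
        urls = ['https://example.com/%d' % i for i in range(4)]
        print(main(urls))

The same shape carries over to aiomultiprocess: let the coroutine you hand to the pool create its own ClientSession, and map over plain URLs rather than (session, url) pairs.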