python aiohttp performance: connect performed on the main thread
I have the following code:
import asyncio
import aiohttp

urls = [
    'http://54.224.27.241',
    'http://54.224.27.241',
    'http://54.224.27.241',
    'http://54.224.27.241',
    'http://54.224.27.241',
]

async def query(urls):
    out = []
    with aiohttp.ClientSession() as session:
        for url in urls:
            try:
                async with session.get(url, timeout=5) as resp:
                    text = await resp.text()
                    out.append(resp.status)
            except:
                print('timeout')
    return out

loop = asyncio.get_event_loop()
out = loop.run_until_complete(query(urls))
loop.close()
print(str(out))
This code is much slower than an equivalent one that uses a thread pool, and it keeps getting slower as you increase the number of URLs (say 20, 50, etc.).
I have a feeling that the initial connection establishment is not done in an async way.
(Note that I am deliberately connecting to a non-existent server here to produce a connection timeout.)
Can someone point out what is wrong here?
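For reference, the thread-pool version being compared against is not shown in the question; a minimal sketch of what it might look like (`fetch` here is a hypothetical stub that sleeps instead of performing a real HTTP request, so the example is self-contained):

```python
import time
from concurrent.futures import ThreadPoolExecutor

def fetch(url):
    # Hypothetical stand-in for requests.get(url, timeout=5).status_code;
    # sleeps to simulate network latency so the example needs no network.
    time.sleep(0.2)
    return 200

urls = ['http://54.224.27.241'] * 5

# One worker per URL: all five "requests" sleep concurrently,
# so the whole batch takes about one latency rather than five.
with ThreadPoolExecutor(max_workers=len(urls)) as pool:
    statuses = list(pool.map(fetch, urls))
print(statuses)
```

With real blocking I/O the threads overlap their waits the same way, which is why this version outpaces the sequential `await` loop above.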
Warning: I don't promise this code works, as I can't install aiohttp atm, but looking at the example in the docs:
import asyncio
import aiohttp
import async_timeout

async def fetch(session, url):
    async with async_timeout.timeout(10):
        async with session.get(url) as response:
            return await response.text()

async def main():
    async with aiohttp.ClientSession() as session:
        html = await fetch(session, 'http://python.org')
        print(html)

if __name__ == '__main__':
    loop = asyncio.get_event_loop()
    loop.run_until_complete(main())
Notice how they're calling aiohttp.ClientSession() with the async keyword. Additionally, I was getting an error on your line data = await async with session.get(url) as resp:
async def fetch(session, url):
    async with session.get(url) as response:
        return await response.text()

async def main():
    out = []
    async with aiohttp.ClientSession() as session:
        for url in urls:
            data = await fetch(session, url)
            out.append(data)
    return out

if __name__ == '__main__':
    loop = asyncio.get_event_loop()
    out = loop.run_until_complete(main())
    print(out)
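One more observation (my own assumption, beyond what the question asked): even with async with fixed, the for loop above awaits each request before starting the next, so the requests still run one after another. asyncio.gather starts them all before awaiting any of them. A self-contained sketch with a stubbed-out fetch (asyncio.sleep stands in for the real network call):

```python
import asyncio
import time

async def fake_fetch(url):
    # Stand-in for session.get(): sleeps instead of doing real I/O.
    await asyncio.sleep(0.2)
    return 200

async def main():
    urls = ['http://54.224.27.241'] * 5
    # gather schedules every coroutine before awaiting any of them,
    # so total wall time is about one latency, not five.
    return await asyncio.gather(*(fake_fetch(u) for u in urls))

start = time.monotonic()
statuses = asyncio.run(main())
elapsed = time.monotonic() - start
print(statuses, round(elapsed, 1))
```

Swapping fake_fetch for the real fetch(session, url) should give the same concurrency benefit with aiohttp.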