Send multiple get requests for the same url using python

I have to send a couple of thousand GET requests to the same url. It takes quite a long time when I do it in a for loop, so I'm looking for a better/faster/stronger solution. Do you have any ideas?

import requests
import tqdm

url = 'https://site.ru:port/api/v2/getRequest'

for index, row in tqdm.tqdm(data.iterrows(), total=data.shape[0]):
    params = {
        'param1_key': row['param1_value'],
        'param2_key': row['param2_value'],
    }
    response = requests.get(url, params=params, headers={'apikey': api_key}, timeout=30)

When you do it in a for loop, you have to wait for a response each time. So to make it better/faster/stronger... do it async with requests-futures or grequests!
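For instance, a minimal sketch with requests-futures, reusing the url, data and api_key from the question (the worker count of 10 is just an assumption, tune it to what the server tolerates):

from concurrent.futures import as_completed
from requests_futures.sessions import FuturesSession

url = 'https://site.ru:port/api/v2/getRequest'

# A session backed by a pool of 10 worker threads
session = FuturesSession(max_workers=10)

# Fire off every request without blocking on the responses
futures = [
    session.get(url,
                params={'param1_key': row['param1_value'],
                        'param2_key': row['param2_value']},
                headers={'apikey': api_key},
                timeout=30)
    for _, row in data.iterrows()
]

# Process responses in completion order
for future in as_completed(futures):
    response = future.result()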

(Also see Asynchronous Requests with Python requests - it's pretty dated though, and points you to grequests anyway.)

Hard to say without more info. iterrows is known not to be fast, but I would bet a coin that the most time-consuming part is waiting for the response from the server.

In that case, multiprocessing.dummy.Pool can be a handy tool to start multiple concurrent requests without waiting for each one to finish before starting the next. But beware: sending too many requests to a single server can be seen as an attack...
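A minimal sketch of that approach, again assuming the url, data and api_key from the question (the pool size of 10 is an arbitrary choice; keep it small enough that the server does not mistake the burst for abuse):

from multiprocessing.dummy import Pool  # thread-backed Pool, despite the module name

import requests

url = 'https://site.ru:port/api/v2/getRequest'

def fetch(row):
    # One GET per row, executed in a worker thread
    params = {'param1_key': row['param1_value'],
              'param2_key': row['param2_value']}
    return requests.get(url, params=params,
                        headers={'apikey': api_key}, timeout=30)

# 10 threads issue requests concurrently; map preserves input order
with Pool(10) as pool:
    responses = pool.map(fetch, [row for _, row in data.iterrows()])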
