from googlesearch import search

def get_results(req):
    # Fetch the first 10 results for the query, routed through a proxy
    response = search(req, num_results=10, proxy='3.8.203.84:3128')
    links = []
    for result in response:
        links.append(result)
    return links

for i in range(100):
    print(get_results('google'))
I tried using a proxy and rate limits. With the proxy the problem doesn't disappear, and with rate limits it takes a very long time.
Google applies rate limiting to stop too many requests from spamming and overloading its servers, which is why you get this error. What I would suggest is sleeping between requests.
You can use:
import time
time.sleep(<n_seconds>)
so your code would look like:
import time

for i in range(100):
    print(get_results('google'))
    time.sleep(1)
which will pause your program for 1 second between requests.
There are more advanced/smarter ways to do this using libraries like https://pypi.org/project/limit/, but for now I would try the sleep method and see if it works; as you get more advanced you can refine it.
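If you want something a bit more structured than scattering sleep calls without pulling in a library, here is a minimal sketch of a rate-limiting decorator using only the standard library. The rate_limited name and the 1-second interval are illustrative choices, not part of any package:

```python
import time
from functools import wraps

def rate_limited(min_interval):
    """Decorator: enforce at least min_interval seconds between calls."""
    def decorator(func):
        last_call = [0.0]  # mutable cell so the wrapper can update it

        @wraps(func)
        def wrapper(*args, **kwargs):
            elapsed = time.monotonic() - last_call[0]
            if elapsed < min_interval:
                # Not enough time has passed since the last call: wait
                time.sleep(min_interval - elapsed)
            last_call[0] = time.monotonic()
            return func(*args, **kwargs)
        return wrapper
    return decorator

# Hypothetical usage with the get_results function from the question:
# @rate_limited(1.0)
# def throttled_results(query):
#     return get_results(query)
```

This keeps the throttling in one place, so every call site of the decorated function is automatically spaced out instead of each loop having to remember its own time.sleep.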