
Accessing Webpages Using Proxies In Python

I am new to Python and doing web scraping. I use the googlesearch module to get links, but after many requests Google blocks my IP. I then used Tor with SOCKS to do the same task, but the same thing happened again. Now I have decided to use proxies, but when I make a request through a proxy it throws an exception. Below is the code I use. When I set the proxy manually in the Chrome browser it works fine, so why does it throw an exception when I access the page from Python?

import requests

proxies = {'http': 'socks5://user:pass@host:port',
           'https': 'socks5://user:pass@host:port'}
resp = requests.get('http://https://www.google.com', proxies=proxies)

Please check your URL. It should be https://www.google.com/, but your code passes the malformed URL http://https://www.google.com. Other than that, if you still get an error, try adding a User-Agent header to the request.

import requests

proxies = {
    'http': 'socks5://user:pass@host:port',
    'https': 'socks5://user:pass@host:port'
}
headers = {
    "User-Agent": "Mozilla/5.0 (X11Linux x86_64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/51.0.2704.103 Safari/537.36"
}
resp = requests.get('https://www.google.com', proxies=proxies, headers=headers)
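
One more thing to check: requests only understands socks5:// proxy URLs when the SOCKS extra is installed (pip install requests[socks]); without it the call fails before the proxy is even contacted. Below is a minimal sketch for verifying the proxy itself works, assuming the public echo endpoint https://httpbin.org/ip and placeholder proxy credentials:

import requests

# Placeholder proxy credentials -- replace with your real SOCKS5 proxy details.
proxies = {
    'http': 'socks5://user:pass@host:port',
    'https': 'socks5://user:pass@host:port',
}

try:
    # httpbin.org/ip echoes back the IP the request came from,
    # so a working proxy shows the proxy's address, not yours.
    resp = requests.get('https://httpbin.org/ip', proxies=proxies, timeout=10)
    resp.raise_for_status()
    print(resp.json())
except requests.exceptions.ProxyError as exc:
    # Raised when the proxy refuses the connection or the credentials are wrong.
    print('Proxy error:', exc)
except requests.exceptions.RequestException as exc:
    # Any other requests failure (missing SOCKS support, DNS, timeout, bad status, ...).
    print('Request failed:', exc)

If this prints the proxy's IP, the proxy setup is fine and the problem is only the URL; if it raises a ProxyError, the issue is the proxy address or credentials rather than your code.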
