Requests keeps using my own IP address even though I'm routing through Tor (and HTTPS requests error out)
When I run Python requests, I noticed that it sends my actual IP address, even though I set up a new IP address with Tor. Here is my code:
```python
import random

import requests
from bs4 import BeautifulSoup
from torrequest import TorRequest

tr = TorRequest(proxy_port=9050, ctrl_port=9051, password=r"mypassword")
response = tr.get('http://ipecho.net/plain')
proxies = {'http': "socks5://" + response.text + ":9050"}
# user_agents: a list of User-Agent strings defined elsewhere in my script
page_response = requests.get('https://www.google.com/search?&q=Apple',
                             timeout=60, verify=False,
                             headers={'User-Agent': random.choice(user_agents)},
                             proxies=proxies)
soup = BeautifulSoup(page_response.content, 'html.parser')
```
However, Google still sees my own IP address rather than the one Tor generated. How come?
How do you know Google is seeing your actual IP while you're using a proxy? The proxy you're using may have been blocked by Google, or the proxy may be timing out on the first connection. To rule out these possible causes, you can write the code like this:
```python
import random

import requests
from bs4 import BeautifulSoup
from torrequest import TorRequest

tr = TorRequest(proxy_port=9050, ctrl_port=9051, password=r"mypassword")
response = tr.get('http://ipecho.net/plain')
proxies = {'http': "socks5://" + response.text + ":9050"}

# This check tells you whether your proxy is working or not:
# if the IP reported through the proxy matches the expected proxy IP,
# the proxy is actually being used.
try:
    print("The proxy for the request is {0}".format(response.text))
    proxy_check = requests.get('http://icanhazip.com', timeout=60, proxies=proxies)
    print("Proxy is {0}".format(proxy_check.text))
except requests.exceptions.RequestException as e:
    print(e)

# Catch RequestException to surface anything requests raises, e.g. a timeout.
try:
    # user_agents: a list of User-Agent strings defined elsewhere in your script
    page_response = requests.get('https://www.google.com/search?&q=Apple',
                                 timeout=60, verify=False,
                                 headers={'User-Agent': random.choice(user_agents)},
                                 proxies=proxies)
    soup = BeautifulSoup(page_response.content, 'html.parser')
except requests.exceptions.RequestException as e:
    print(e)
```
Now you can tell which IP is actually used to reach Google. If the proxy itself is fine, then Google has most likely blocked that proxy.
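One detail worth checking separately: requests looks up the proxy to use by the URL's scheme, so a `proxies` dict that defines only an `'http'` entry is silently ignored for `https://` URLs, and the request goes out from your real IP. The sketch below is a simplified mimic of that per-scheme lookup (the helper `proxy_for` is hypothetical, not part of requests), plus the usual dict shape for routing both schemes through a local Tor SOCKS port (assuming Tor's default port 9050):

```python
from urllib.parse import urlparse

# Assumed local Tor setup: the Tor daemon listens on its default SOCKS
# port 9050. 'socks5h' asks the proxy to resolve hostnames as well, so
# DNS lookups also go through Tor; plain 'socks5' resolves DNS locally.
TOR_PROXIES = {
    'http': 'socks5h://127.0.0.1:9050',
    'https': 'socks5h://127.0.0.1:9050',
}

def proxy_for(url, proxies):
    """Simplified mimic of requests' per-scheme proxy selection:
    the proxy is looked up by the URL's scheme, None if no entry matches."""
    return proxies.get(urlparse(url).scheme)

# With only an 'http' entry (as in the question), an https:// request
# gets no proxy at all, which is why Google still saw the real IP.
print(proxy_for('https://www.google.com/search?&q=Apple',
                {'http': 'socks5://1.2.3.4:9050'}))   # None
print(proxy_for('https://www.google.com/search?&q=Apple', TOR_PROXIES))
```

To actually use SOCKS proxies with requests you need the SOCKS extra (`pip install requests[socks]`); then `requests.get(url, proxies=TOR_PROXIES, timeout=60)` sends both HTTP and HTTPS traffic through the local Tor port.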