I'm trying to build a basic proxy checker utility in Python. This is what I have right now:
import requests
from bs4 import BeautifulSoup

currentip = ""
originalip = ""
isProxied = False
proxies = ["104.236.54.196:8080", "187.62.191.3:61456", "138.204.179.162:44088", "91.216.66.70:32306"]
proxy_count = len(proxies)
url = "https://www.ipchicken.com/"
r = requests.get(url)

def statement():
    global currentip
    global originalip
    print("Current ip is: " + currentip)
    print("Your true ip is: " + originalip)

def main(req):
    global currentip
    soup = BeautifulSoup(req.content, "html.parser")
    html = soup.html
    body = html.body
    font = body.find_all('font')
    ip_container = font[0].b
    ip = ip_container.contents[0]
    currentip = ip

main(r)
originalip = currentip
statement()
print("\n\n")
print("testing proxies...")
print("\n\n")
for x in range(proxy_count):
    proxyContainer = {"http": "http://" + proxies[x]}
    r2 = requests.get(url, proxies=proxyContainer, timeout=20)
    print("proxy: " + proxies[x])
    main(r2)
    statement()
    print("\n\n")
    if currentip == originalip:
        print("Proxy failed.")
    else:
        print("This proxy works")
    print("\n")
The code runs fine and the requests are made, but they don't seem to be proxied. Here is my output:
Current ip is:
199.229.249.163
Your true ip is:
199.229.249.163
testing proxies...
proxy: 104.236.54.196:8080
Current ip is:
199.229.249.163
Your true ip is:
199.229.249.163
Proxy failed.
proxy: 187.62.191.3:61456
Current ip is:
199.229.249.163
Your true ip is:
199.229.249.163
Proxy failed.
proxy: 138.204.179.162:44088
Current ip is:
199.229.249.163
Your true ip is:
199.229.249.163
Proxy failed.
proxy: 91.216.66.70:32306
Current ip is:
199.229.249.163
Your true ip is:
199.229.249.163
Proxy failed.
I have tested these proxies in a separate program and they seem to work fine, so I don't think the proxies are the issue.
If you connect to an encrypted https URL, then you have to set a proxy for https connections, but you set a proxy only for http, so requests doesn't use the proxy at all.
The real problem is finding a working proxy.
I took one from https://hidemy.name/en/proxy-list/?type=s#list, but I don't know how long it will keep working.
To test the IP, I used httpbin.org, which returns the data as JSON, so it is easy to display or convert to a Python dictionary.
import requests

url = "https://httpbin.org/ip"

proxies = {
    #"http": '141.125.82.106:80',
    "https": '141.125.82.106:80',
}

r = requests.get(url, proxies=proxies)
print(r.text)

ip = r.json()["origin"]
print('IP:', ip)
BTW: another problem can be that some proxies send your IP in an extra header, and servers can read it, so not all proxies are anonymous.
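One way to check for that leak is to request https://httpbin.org/headers through the proxy and inspect which headers the server actually received. A small sketch of the decision logic (the header names below are the common revealing ones, not an exhaustive list; you would pass it the dict from `requests.get("https://httpbin.org/headers", proxies=proxies).json()["headers"]`):

```python
# Headers that typically reveal the client's real IP; real proxies may use others.
REVEALING_HEADERS = {"X-Forwarded-For", "X-Real-Ip", "Via", "Forwarded"}

def looks_anonymous(received_headers):
    """Return True if none of the known revealing headers reached the server."""
    return not (set(received_headers) & REVEALING_HEADERS)

# Example header dicts, shaped like httpbin.org/headers output:
print(looks_anonymous({"Host": "httpbin.org", "User-Agent": "python-requests"}))       # anonymous
print(looks_anonymous({"Host": "httpbin.org", "X-Forwarded-For": "199.229.249.163"}))  # leaks the IP
```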
EDIT: Version with https://www.ipchicken.com/
import requests
from bs4 import BeautifulSoup

def get_ip(request):
    soup = BeautifulSoup(request.content, "html.parser")
    return soup.find('font').b.contents[0]

url = "https://www.ipchicken.com/"

proxies = {
    #"http": '141.125.82.106:80',
    "https": '141.125.82.106:80',
}

r = requests.get(url, proxies=proxies)
ip = get_ip(r)
print(ip)
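Since free proxies die all the time, it also helps to wrap each request in try/except so a dead proxy is reported instead of crashing the loop. A sketch along those lines (the `check_proxy` helper is my own name, not part of requests; it sets the proxy for both schemes so it is used regardless of the URL):

```python
import requests

def check_proxy(proxy, url="https://httpbin.org/ip", timeout=10):
    """Return the IP the server saw through `proxy`, or None if the proxy failed."""
    # Set the proxy for BOTH schemes so it applies whether `url` is http or https.
    proxies = {"http": proxy, "https": proxy}
    try:
        r = requests.get(url, proxies=proxies, timeout=timeout)
        return r.json()["origin"]
    except (requests.RequestException, ValueError):
        # Connection refused, timeout, bad proxy response, or a non-JSON body.
        return None

for proxy in ["104.236.54.196:8080", "187.62.191.3:61456"]:
    ip = check_proxy(proxy, timeout=5)
    print(proxy, "->", ip if ip else "failed")
```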