
Python: urllib2 and proxies

I'm trying to write a script in Python that reloads a page every x seconds using a list of proxies, and I'm having an issue at the moment. I know it's not the proxies' fault either, because I can ping them and they respond fine. They are HTTP proxies. My script gives me this error:

urllib.error.URLError: <urlopen error [WinError 10061] No connection could be made because the target machine actively refused it>

I have no idea how to fix it. Here is the actual script:

import urllib.request
import time
proxy_list = input("Name of proxy list file?: ")
proxy_file = open(proxy_list, 'r')
url = input("URL to bot? (Has to include http://): ")
sleep = float(input("Time between reloads? (In seconds, 0 for none): "))
proxies = []
for line in proxy_file:
    proxies.append( line )
proxies = [w.replace('\n', '') for w in proxies]

while True:
    for i in range(len(proxies)):
        proxy = proxies[i]
        proxy2 = {"http":"http://%s" % proxy}
        proxy_support = urllib.request.ProxyHandler(proxy2)

        opener = urllib.request.build_opener(proxy_support)
        urllib.request.install_opener(opener)
        urllib.request.urlopen(url).read()
        time.sleep(float(sleep))

Thanks.

Don't use urllib2. Seriously, just don't.

Your holy grail: requests.

What you're trying to do is then:

import time
import requests

while True:
    for proxy in proxies:
        # requests expects the scheme in the proxy URL
        r = requests.get(url, proxies={"http": "http://%s" % proxy})
        print(r.text)
        time.sleep(sleep)
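
If some of the proxies in your list are dead or refusing connections (which is what the WinError 10061 suggests), you can skip them instead of crashing. This is just a sketch; the timeout value and the exception handling are assumptions, not part of the answer above:

import time
import requests

while True:
    for proxy in proxies:
        try:
            # timeout here is an assumed value; tune it to your needs
            r = requests.get(url,
                             proxies={"http": "http://%s" % proxy},
                             timeout=10)
            print(r.text)
        except requests.exceptions.RequestException as e:
            # the proxy is unreachable or refused the connection; move on
            print("proxy %s failed: %s" % (proxy, e))
        time.sleep(sleep)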
