
urllib2 timeout

I'm using the urllib2 library in my code, and I make a lot of calls to urlopen. EDIT: loadurl

I have a problem on my network: when I'm browsing sites, my browser sometimes gets stuck on "Connecting" to a certain website, and sometimes it returns a timeout.

My question is: if I use urllib2 in my code, will it time out when a connection to a certain website takes too long, or will the code get stuck on that line?

I know that urllib2 can handle timeouts without one being specified in the code, but does that apply to this kind of situation?

Thanks for your time

EDIT :

def checker(self):
    try:
        html = self.loadurl("MY URL HERE")
        if self.ip_ != html:
            (...)
    except Exception, error:
        html = "bad"

From my small research: the timeout parameter of the urllib2.urlopen() function was added in Python 2.6.

So the timeout problem should be resolved by passing a custom timeout to the urllib2.urlopen function. The code should look like this:

response = urllib2.urlopen("---insert url here---", None, your_timeout_value)

The your_timeout_value parameter is optional and defines the timeout in seconds.
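As a sketch of how the question's loadurl wrapper could use this (the function name and URL are placeholders from the question, not a real API; the compatibility import is only needed because Python 3 renamed urllib2 to urllib.request):

```python
import socket

try:
    import urllib2  # Python 2
except ImportError:
    import urllib.request as urllib2  # Python 3 name for the same urlopen()

def loadurl(url, timeout=10):
    # Return the page body, or "bad" on timeout/failure,
    # mirroring the checker() pattern from the question.
    try:
        return urllib2.urlopen(url, None, timeout).read()
    except socket.timeout:
        return "bad"  # connect or read took longer than `timeout` seconds
    except Exception:
        return "bad"  # DNS failure, connection refused, HTTP error, ...
```

With this, a site stuck on "Connecting" raises socket.timeout after `timeout` seconds instead of hanging the line forever.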

EDIT: From your comment, I understand that you don't want the code to wait too long. In that case you can set a global default timeout so it doesn't get stuck:

import socket
import urllib2

socket.setdefaulttimeout(10)

The value 10 can be adjusted according to your connection speed and how long the website typically takes to load.
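A minimal sketch of the global-default approach (the URL is a placeholder that fails fast; 10 seconds is just an illustrative value):

```python
import socket

try:
    import urllib2  # Python 2
except ImportError:
    import urllib.request as urllib2  # Python 3 name for the same urlopen()

socket.setdefaulttimeout(10)  # every socket created from now on defaults to 10s

# Any urlopen() call made after this point inherits the 10-second default,
# so a stuck "Connecting" raises socket.timeout instead of hanging forever.
try:
    html = urllib2.urlopen("http://localhost:1").read()  # placeholder URL
except Exception:
    html = "bad"  # timed out or otherwise failed, as in checker()
```

Note that setdefaulttimeout affects all sockets in the process, not just urllib2's, so the per-call timeout argument is the more targeted option.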

The technical post webpages of this site follow the CC BY-SA 4.0 protocol. If you need to reprint, please indicate the site URL or the original address. Any question please contact: yoyou2525@163.com.

 