I need help in Python 2.7 with reading data from a local web server using urllib2.Request. I use the code attached below, and it works fine as long as the server (192.168.5.44) is reachable. If the server is unavailable, the code stops with an error:
URLError: <urlopen error [Errno 113] No route to host>
What should I add to the code so that, in the event of an error, it continues and records a placeholder value (e.g. "N/A")? I have several servers like this, and everything stops if one of them is down.
My code:
import time
import urllib2

request = urllib2.Request("http://192.168.5.44")
fip44 = urllib2.urlopen(request)
time.sleep(1)
sip44 = fip44.read()

def beri44(sip44, first, last):
    start = sip44.index(first) + len(first)
    end = sip44.index(last, start)
    return sip44[start:end]

jed = float(beri44(sip44, "[", "]"))
fjed = open("/var/www/html/Temp/jed.txt", "w")
fjed.write(str(jed))
fjed.close()
print "jed:", jed
Wrap the code in a try/except block so the program does not stop when an error occurs, as follows:
try:
    request = urllib2.Request("http://192.168.5.44")
    fip44 = urllib2.urlopen(request)
    time.sleep(1)
    sip44 = fip44.read()

    def beri44(sip44, first, last):
        start = sip44.index(first) + len(first)
        end = sip44.index(last, start)
        return sip44[start:end]

    jed = float(beri44(sip44, "[", "]"))
    fjed = open("/var/www/html/Temp/jed.txt", "w")
    fjed.write(str(jed))
    fjed.close()
    print "jed:", jed
except Exception:
    # Do whatever you want here when an error occurs,
    # e.g. write a placeholder value such as "N/A"
    pass
Hope it helps.
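Since you have several servers, you can go one step further and put the fetch-and-parse logic in a helper that returns "N/A" on any failure, so one dead server never stops the loop over the rest. A minimal sketch; the second server address is a hypothetical example, and the fallback import only exists so the snippet also runs outside Python 2:

```python
# Python 2 provides urllib2; the fallback lets the sketch run on Python 3 too.
try:
    import urllib2
except ImportError:
    import urllib.request as urllib2

def extract(data, first, last):
    # Same logic as beri44: return the text between the two markers.
    start = data.index(first) + len(first)
    end = data.index(last, start)
    return data[start:end]

def read_server(url, timeout=5):
    """Fetch url and return the value between "[" and "]" as a float.
    Returns the string "N/A" on any error (unreachable host, timeout,
    missing markers), so the calling loop always continues."""
    try:
        response = urllib2.urlopen(urllib2.Request(url), timeout=timeout)
        data = response.read()
        if isinstance(data, bytes):  # Python 3 returns bytes
            data = data.decode("utf-8", "replace")
        return float(extract(data, "[", "]"))
    except Exception:
        return "N/A"

# Hypothetical server list; every value is recorded even if a server is down.
for ip in ("192.168.5.44", "192.168.5.45"):
    value = read_server("http://%s" % ip, timeout=2)
    print("%s: %s" % (ip, value))
```

Each result can then be written to its own file, as in your original code, with "N/A" marking a server that did not respond.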