python 3 - urllib issue
I am using Python 3.3.0 on Windows 7. I have two files: dork.txt and fuzz.py.
dork.txt contains the following:
/about.php?id=1
/en/company/news/full.php?Id=232
/music.php?title=11
fuzz.py contains the following:
import urllib.request
import urllib.error

srcurl = "ANY-WEBSITE"
drkfuz = open("dorks.txt", "r").readlines()
print("\n[+] Number of dork names to be fuzzed:", len(drkfuz))
for dorks in drkfuz:
    dorks = dorks.rstrip("\n")
    srcurl = "http://"+srcurl+dorks
    requrl = urllib.request.Request(srcurl)
    #httpreq = urllib.request.urlopen(requrl)
    # Starting the request
    try:
        httpreq = urllib.request.urlopen(requrl)
    except urllib.error.HTTPError as e:
        print("[!] Error code: ", e.code)
        print("")
        #sys.exit(1)
    except urllib.error.URLError as e:
        print("[!] Reason: ", e.reason)
        print("")
        #sys.exit(1)
    #if e.code != 404:
    if httpreq.getcode() == 200:
        print("\n*****srcurl********\n", srcurl)
        return srcurl
So when I enter the name of a website that actually has /about.php?id=1, it works fine. But when I give it a website that has /en/company/news/full.php?Id=232, it first prints Error code: 404 and then gives me the following error: UnboundLocalError: local variable 'e' referenced before assignment or UnboundLocalError: local variable 'httpreq' referenced before assignment.
I can understand that if a website has no page at /about.php?id=1, it prints Error code: 404. But why doesn't it go back into the for loop to check the remaining dorks in the text file? Why does it stop there and throw an error?
I want to make a script that finds the valid pages of a website address like www.xyz.com.
When the expression urllib.request.urlopen(requrl) throws an exception, the variable httpreq is never set. You can set it to None before the try statement, then test whether it is still None afterwards:
httpreq = None
try:
    httpreq = urllib.request.urlopen(requrl)
# ...

if httpreq is not None and httpreq.getcode() == 200:
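Folding that fix into the whole loop, here is a minimal sketch. The `fuzz` function name and the injectable `opener` parameter are illustrative additions, not part of the original script (the `opener` argument just makes the logic easy to test without a live site). It also builds each URL from the fixed base instead of reassigning srcurl, which in the original keeps growing on every iteration, and uses continue so a failed request moves on to the next dork:

```python
import urllib.request
import urllib.error

def fuzz(baseurl, dorks, opener=urllib.request.urlopen):
    """Return the first full URL under baseurl that answers with HTTP 200, else None."""
    for dork in dorks:
        # Build the URL from the unchanged base each time
        url = "http://" + baseurl + dork.rstrip("\n")
        try:
            resp = opener(url)
        except urllib.error.HTTPError as e:   # must come before URLError (it is a subclass)
            print("[!] Error code:", e.code, "for", url)
            continue  # keep going with the next dork
        except urllib.error.URLError as e:
            print("[!] Reason:", e.reason, "for", url)
            continue
        if resp.getcode() == 200:
            print("\n*****srcurl********\n", url)
            return url
    return None
```

Because the except blocks end in continue, resp is only ever inspected when urlopen actually succeeded, so the UnboundLocalError cannot occur.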
srcurl = "ANY-WEBSITE"
drkfuz = open("dorks.txt", "r").readlines()
print("\n[+] Number of dork names to be fuzzed:", len(drkfuz))
for dorks in drkfuz:
    dorks = dorks.rstrip("\n")
    srcurl = "http://"+srcurl+dorks
    try:
        requrl = urllib.request.Request(srcurl)
        if requrl is not None:  # Request objects do not support len()
            try:
                httpreq = urllib.request.urlopen(requrl)
                if httpreq.getcode() == 200:
                    print("\n*****srcurl********\n", srcurl)
                    return srcurl
            except:
                # Handle exception
                pass
    except:
        # Handle your exception
        print("Exception")
The code is untested, but it should work logically.