I used the Python multiprocessing module to test some proxies. They worked well at the beginning, but after a few minutes they became slower. I checked Task Manager and found that only 2 subprocesses were left. Eventually all of the processes stopped, even though their tasks were not done!
May I ask why? T^T
#coding:utf-8
import urllib2
import re
import cookielib
import time
import urllib
import multiprocessing

h = {
    'Connection': 'keep-alive',
    'Accept': '*/*',
    'User-Agent': 'Mozilla/5.0 (Windows NT 6.1; WOW64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/41.0.2272.118 Safari/537.36',
}

# load the candidate proxies, one IP per line
f_r = open('ip2.txt', 'r')
ips = f_r.readlines()

q = multiprocessing.Queue()
for ip in ips:
    q.put(ip.replace('\n', ''))

def worker(q):
    global h
    while not q.empty():
        ip = q.get()
        proxy_ip = 'http://' + ip
        print proxy_ip
        proxy = urllib2.ProxyHandler({'http': proxy_ip})
        cj = cookielib.CookieJar()
        cookie_support = urllib2.HTTPCookieProcessor(cj)
        opener = urllib2.build_opener(cookie_support, urllib2.HTTPHandler)
        opener.add_handler(proxy)
        urllib2.install_opener(opener)
        try:
            urllib2.urlopen(urllib2.Request('http://www.zhihu.com', headers=h), timeout=3)
            print ip + ' OK!!!'
            with open('ip_canuse.txt', 'a') as f_w:
                f_w.write(ip + '\n')
            break
        except Exception, e:
            print e
            continue

if __name__ == '__main__':
    ps = []
    for i in range(10):
        ps.append(multiprocessing.Process(target=worker, args=(q,)))
    for p in ps:
        p.daemon = True
        p.start()
    for p in ps:
        p.join()
    print "end"
It turns out I had added an unnecessary `break` there, and that was what caused the problem.
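That `break` explains the symptom exactly: after a successful urlopen, the worker writes the IP to ip_canuse.txt and then breaks out of its while loop, so each of the ten processes dies right after its first working proxy — which is why Task Manager showed fewer and fewer subprocesses until none were left. Dropping the `break` lets each worker keep draining the queue. A minimal Python 3 sketch of the control flow (`worker_loop` and the `results` list are illustrative stand-ins, not the original code; it also uses a non-blocking `get_nowait()` instead of the racy `while not q.empty(): q.get()` pattern, since with several processes another worker can empty the queue between those two calls):

```python
import queue

def worker_loop(q, results, stop_after_first=False):
    """Drain `q`, recording each item in `results`.

    stop_after_first=True reproduces the stray `break` from the
    question: the worker returns after handling a single item.
    """
    while True:
        try:
            ip = q.get_nowait()   # non-blocking; avoids the empty()/get() race
        except queue.Empty:
            return                # queue drained: the worker exits cleanly
        results.append(ip)        # stands in for the proxy check + file write
        if stop_after_first:
            break                 # the bug: one item handled, then the process dies
```

With ten workers and the stray `break`, each process handles at most one proxy before exiting, which matches the observed "processes disappearing one by one" behavior.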