
My Python program unexpectedly quits when I use python-requests in a subprocess

In my spider project, I have a piece of code that crawls the hottest topic links from Sina Weibo and feeds them to my spiders. It works perfectly when I test it on its own. However, the same code makes Python quit unexpectedly when I run it inside a Process. I found that the failure comes from using python-requests in that code: when I rewrote it with urllib3, it worked normally.

This code runs on macOS Mojave. The Python version is 3.7 and the python-requests version is 2.21.0.
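To show the shape of the problem, here is a minimal sketch of the failing setup (the URL and names here are placeholders, not the real project code):

import requests
from multiprocessing import Process

def fetch():
    # Works when called directly, but the child process quits
    # unexpectedly when this runs under Process on macOS.
    res = requests.get("https://s.weibo.com/top/summary")  # placeholder URL
    print(res.status_code)

if __name__ == "__main__":
    p = Process(target=fetch)
    p.start()
    p.join()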

"""
The run_spider function periodically crawls the links and feeds them to the spiders.
"""
@staticmethod
def run_spider():
    try:
        cs = CoreScheduler()
        while True:
            cs.feed_spider()
            # Poll for completion, halving the wait each round,
            # but never sleeping less than 10 seconds.
            first_time = 3 * 60
            while not cs.is_finish():
                time.sleep(first_time)
                first_time = max(10, first_time // 2)
            cs.crawl_done()
            time.sleep(SPIDER_INTERVAL)
    except Exception as e:
        print(e)
"""
cs.feed_spider() just crawls and parses the page; it returns a generator of links. The code is shown below.
"""
def get_page(self):
    headers = {
        'Accept': 'text/html,application/xhtml+xml,application/xml;q=0.9,*/*;q=0.8',
        'Accept-Language': 'zh-cn',
        'Host': 's.weibo.com',
        'Accept-Encoding': 'br, gzip, deflate',
        'User-Agent': 'Mozilla/5.0 (iPhone; CPU iPhone OS 11_3) AppleWebKit/605.1.15 '
                      '(KHTML, like Gecko) Version/11.0 Mobile/15E148 Safari/604.1',
    }
    # res = requests.get(self.TARGET_URL, headers=headers)  # crashes in the child process
    http = urllib3.PoolManager()
    res = http.request("GET", self.TARGET_URL, headers=headers)
    if res.status == 200:
        return res.data
    else:
        return None
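For comparison, the commented-out requests call above expands to roughly the following (a reconstruction, not the exact project code; note that requests exposes status_code and content where urllib3's response object uses status and data):

import requests

def get_page_with_requests(target_url, headers):
    # Reconstructed requests-based variant; this is the version
    # that made the child process quit unexpectedly.
    res = requests.get(target_url, headers=headers)
    if res.status_code == 200:
        return res.content  # bytes, equivalent to urllib3's res.data
    return None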
"""
The crawler runs as a child process, as shown below.
"""
def run(self):
    spider_process = Process(target=Scheduler.run_spider)
    spider_process.start()

I expected python-requests to work here, but it makes the program quit unexpectedly. When I rewrite the code using urllib3, the program runs fine. I don't understand why.

You started the process, but I don't see you waiting for it anywhere. Calling join() makes the main process pause until the spider_process child has finished executing.

That is:

def run(self):
    spider_process = Process(target=Scheduler.run_spider)
    spider_process.start()
    spider_process.join()

Here's a link to the official join() documentation: https://docs.python.org/3/library/multiprocessing.html#multiprocessing.Process.join
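As an aside, once join() has returned you can inspect the child's exitcode attribute to see how it ended; a negative value -N means the child was killed by signal N, which should help you debug the unexpected quit. A small sketch:

def run(self):
    spider_process = Process(target=Scheduler.run_spider)
    spider_process.start()
    spider_process.join()
    # exitcode is 0 for a clean exit, -N if the child was killed by signal N.
    print("spider exit code:", spider_process.exitcode)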
