
Python 2.7 threading queue, reuse threads

I am looking for a way to have a background thread queue run an unlimited number of times. The code below is what I came up with from my research, but it is limited by the number of threads created. In my research I have not been able to find a way to always keep 1 or 2 threads available and run a set function whenever something is added to the queue.

The goal of this example application is to check the status of a site every 10 seconds, and if the response code is not 200, run the code in the notify function.

What happens now is that the code works correctly until the limit of created threads is reached (5 in this case). The main while loop keeps running, but the code that needs to execute when a failure occurs stops, because there are no threads left.

import urllib2, time
from threading import Thread
from Queue import Queue

# Set up global variables
num_threads = 5
queue = Queue()
urlList = [
    "http://google.com",
    "http://googleeeeeee1111.com"
]


def notify(i, q):
    print "Thread %s: started" % i
    url = q.get()
    print "Thread %s: notification sent for site: %s" % (i, url)
    q.task_done()


def make_requests():
    while True:
        for url in urlList:
            try:
                request = urllib2.urlopen(url)
                responseCode = request.getcode()

                # If the response code was 200, do something
                if responseCode == 200:
                    print "URL: %s  -  Success %d" % (url, responseCode)
                else:
                    print "Bad response code for %s  -  %d " % (url, responseCode)
                    queue.put(url)

            except Exception, e:
                print "ERROR MAKING REQUEST TO %s - %s" % (url, e)
                queue.put(url)
        time.sleep(10)  # wait 10 seconds and start again


if __name__ == '__main__':
    # Set up some threads to fetch the enclosures
    for i in range(num_threads):
        worker = Thread(target=notify, args=(i, queue, ))
        worker.setDaemon(True)
        worker.start()

    make_requests()

Before I begin, here is the documentation on Python's GIL and how it affects threading.
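
In practical terms: the GIL lets only one thread run Python bytecode at a time, but blocking calls such as urllib2.urlopen or Queue.get release it while they wait, which is why keeping a couple of idle worker threads around is cheap here. Below is a minimal sketch of my own (not from the original post) that uses time.sleep as a stand-in for a blocking I/O call, showing that the waits overlap despite the GIL:

import time
from threading import Thread

def wait_a_second(name):
    # time.sleep releases the GIL while waiting, much like a blocking
    # socket read or Queue.get() would
    time.sleep(1)
    print "%s done" % name

start = time.time()
threads = [Thread(target=wait_a_second, args=("worker-%d" % i,)) for i in range(3)]
for t in threads:
    t.start()
for t in threads:
    t.join()
# Prints roughly 1 second, not 3: the three waits ran concurrently
print "elapsed: %.1f s" % (time.time() - start)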

I am not sure if this is exactly what you are looking for, but you could wrap notify in a never-ending loop. I modified your code to do something like that, plus a few small corrections not related to the functionality:

import urllib2, time
from threading import Thread
from Queue import Queue

# Set up global variables
num_threads = 3  # you can set it to any number of threads (although 1 would be enough)

queue = Queue()
url_list = [
    "http://google.com",
    "http://googleeeeeee1111.com"
]


def notify(i, q):
    url = q.get()
    print "Thread %d: notification sent for site: %s" % (i, url)
    q.task_done()


def thread_func(i, q):
    print "Thread %d: started" % i
    while True:
        notify(i, q)
    print "Thread %d: ending" % i

def make_requests():
    while True:
        for url in url_list:
            try:
                request = urllib2.urlopen(url)
                response_code = request.getcode()

                # If the response code was 200, do something
                if response_code == 200:
                    print "URL: %s  -  Success %d" % (url, response_code)
                else:
                    print "Bad response code for %s  -  %d " % (url, response_code)
                    queue.put(url)

            except Exception as e:
                print "ERROR MAKING REQUEST TO %s - %s" % (url, e)
                queue.put(url)
        time.sleep(10)  # wait 10 seconds and start again


if __name__ == "__main__":
    # Set up some threads to fetch the enclosures
    for i in xrange(num_threads):
        worker = Thread(target=thread_func, args=(i, queue))
        worker.setDaemon(True)
        worker.start()

    make_requests()
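
One small note on the code above: the final "ending" print is never reached because the while True loop has no exit condition. If you ever need the workers to shut down cleanly, a common pattern is to push a sentinel value onto the queue for each worker and break when it is seen. A minimal sketch of my own (the None sentinel and the worker count are assumptions, not part of the original answer):

from threading import Thread
from Queue import Queue

SENTINEL = None  # hypothetical marker meaning "no more work"

def worker(i, q):
    print "Thread %d: started" % i
    while True:
        url = q.get()
        if url is SENTINEL:
            q.task_done()
            break  # exit the loop instead of blocking forever
        print "Thread %d: notification sent for site: %s" % (i, url)
        q.task_done()
    print "Thread %d: ending" % i  # now reachable

if __name__ == "__main__":
    q = Queue()
    workers = [Thread(target=worker, args=(i, q)) for i in xrange(2)]
    for w in workers:
        w.start()
    q.put("http://googleeeeeee1111.com")
    for _ in workers:
        q.put(SENTINEL)  # one sentinel per worker so every loop exits
    for w in workers:
        w.join()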
