
Celery task calling itself after task succeeds without celerybeat

I want to call a celery task every 30 minutes, but only after the current run has finished. Sometimes the task takes longer than expected, because it downloads files from a remote server, so I don't want to use celery beat. Also, as far as I know, self.retry only applies when an error occurs. Here is my task:

@shared_task(name="download_big", bind=True, acks_late=true, autoretry_for=(Exception, requests.exceptiosn.RequestException), retry_kwargs={"max_retries": 4, "countdown": 3}):
def download_big(self):
    my_file = session.get('example.com/hello.mp4')
    if my_file.status_code == requests.codes["OK"]:
        open("hello.mp4", "wb").write(my_file.content)
    else:
        self.retry()

Update:

OK, I changed the structure to this:

@shared_task(name="download_big", bind=True, acks_late=true, autoretry_for=(Exception, requests.exceptiosn.RequestException), retry_kwargs={"max_retries": 4, "countdown": 3}):
def download_big(url):
    my_file = session.get(url, name)
    if my_file.status_code == requests.codes["OK"]:
        open(name, "wb").write(my_file.content)
    else:
        self.retry()

@shared_task(name="download_all", bind=True, acks_late=true, autoretry_for=(Exception, requests.exceptiosn.RequestException), retry_kwargs={"max_retries": 4, "countdown": 3}):
def download_all(self):
    my_list = [...]  # bunch of urls with names
    jobs = []
    for name, url in my_list:
        jobs.append(download_big.si(url, name))
    group(jobs)()

So in this case I have to call download_all instead of download_big. That way the files download in parallel, and once all of the group's tasks are done, it needs to call itself again 30 minutes later.

You could try using a chord, which will run a group of tasks and, when they all complete, run a callback that you can use to reschedule.

For example:

import requests
from celery import chord, shared_task

session = requests.Session()

@shared_task(name="download_big", bind=True, acks_late=True, autoretry_for=(Exception, requests.exceptions.RequestException), retry_kwargs={"max_retries": 4, "countdown": 3})
def download_big(self, url, name):
    my_file = session.get(url)
    if my_file.status_code == requests.codes.ok:
        with open(name, "wb") as f:
            f.write(my_file.content)
    else:
        self.retry()

@shared_task(name="download_all", bind=True, acks_late=True, autoretry_for=(Exception, requests.exceptions.RequestException), retry_kwargs={"max_retries": 4, "countdown": 3})
def download_all(self):
    my_list = [...]  # bunch of urls with names
    jobs = []
    for name, url in my_list:
        jobs.append(download_big.si(url, name))

    # Run the group; once every task in it finishes, the chord callback
    # re-enqueues download_all 30 minutes later. Note the callback must be
    # a signature, not the result of apply_async(), which would run it
    # immediately instead of after the group completes.
    chord(jobs)(download_all.si().set(countdown=1800))
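To start the cycle, you enqueue download_all once; every subsequent run is scheduled by the chord callback. A minimal sketch, assuming a configured Celery app and a result backend (chords need one to detect group completion):

# Start the first run; every later run is enqueued by the chord
# callback 30 minutes after the previous group finishes.
download_all.delay()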

