
Celery task calling itself after task succeeds without celerybeat

I want to call my Celery task periodically, roughly every 30 minutes after the current run finishes. The task downloads files from a remote server, so it sometimes takes longer than expected, which is why I don't want to use celery beat. Also, as I understand it, self.retry() is only meant for when an error has occurred. Here is my task:

@shared_task(name="download_big", bind=True, acks_late=true, autoretry_for=(Exception, requests.exceptiosn.RequestException), retry_kwargs={"max_retries": 4, "countdown": 3}):
def download_big(self):
    my_file = session.get('example.com/hello.mp4')
    if my_file.status_code == requests.codes["OK"]:
        open("hello.mp4", "wb").write(my_file.content)
    else:
        self.retry()
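
For a single task, the beat-free pattern I'm after looks roughly like this: the task queues a fresh copy of itself as its last step. A minimal sketch (repeat_download and do_download are hypothetical names, not part of my code):

@shared_task(name="repeat_download", bind=True)
def repeat_download(self):
    do_download()  # hypothetical stand-in for the actual download work
    # Queue the next run 30 minutes (1800 s) after this one finishes.
    repeat_download.apply_async(countdown=1800)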

Update:

Well, I changed my structure to this:

@shared_task(name="download_big", bind=True, acks_late=true, autoretry_for=(Exception, requests.exceptiosn.RequestException), retry_kwargs={"max_retries": 4, "countdown": 3}):
def download_big(url):
    my_file = session.get(url, name)
    if my_file.status_code == requests.codes["OK"]:
        open(name, "wb").write(my_file.content)
    else:
        self.retry()

@shared_task(name="download_all", bind=True, acks_late=true, autoretry_for=(Exception, requests.exceptiosn.RequestException), retry_kwargs={"max_retries": 4, "countdown": 3}):
def download_all(self):
    my_list = [...]  # bunch of urls with names
    jobs = []
    for name, url in my_list:
        jobs.append(download_big.si(url, name))
    group(jobs)()

So in this case I have to call download_all instead of download_big; that way the files are downloaded in parallel, and once every task in the group is done, download_all needs to call itself again after 30 minutes.
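
To spell out why this is tricky: applying a group only enqueues its subtasks and returns immediately, so a naive version that reschedules right after the group would fire too early. A sketch of that broken approach (download_all_broken is a hypothetical name):

@shared_task(name="download_all_broken", bind=True)
def download_all_broken(self):
    my_list = [...]  # bunch of urls with names
    jobs = [download_big.si(url, name) for name, url in my_list]
    group(jobs)()  # returns as soon as the jobs are queued
    # BROKEN: runs 30 minutes after *queuing*, not after the downloads finish.
    download_all_broken.apply_async(countdown=1800)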

You could try using a chord, which will run a group of tasks and, when they complete, run a callback that you can use to reschedule. For example:

import requests
from celery import chord, shared_task

session = requests.Session()

@shared_task(name="download_big", bind=True, acks_late=True, autoretry_for=(Exception, requests.exceptions.RequestException), retry_kwargs={"max_retries": 4, "countdown": 3})
def download_big(self, url, name):
    my_file = session.get(url)
    if my_file.status_code == requests.codes.ok:
        open(name, "wb").write(my_file.content)
    else:
        self.retry()

@shared_task(name="download_all", bind=True, acks_late=True, autoretry_for=(Exception, requests.exceptions.RequestException), retry_kwargs={"max_retries": 4, "countdown": 3})
def download_all(self):
    my_list = [...]  # bunch of urls with names
    jobs = []
    for name, url in my_list:
        jobs.append(download_big.si(url, name))

    # Run the group; once every task has finished, the chord fires the
    # callback signature, which re-queues download_all 30 minutes later.
    # .si() makes the callback ignore the group's results.
    chord(jobs)(download_all.si().set(countdown=1800))
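
To start the cycle, queue the first run once by hand:

download_all.delay()

Note that a chord needs a result backend configured, because Celery has to track when every task in the group has finished before it fires the callback.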
