
Using Celery, is it possible to execute a list of URLs in a single Celery task?

views.py

urls=["https//:.....com,https//:.....com,etc.."]
for i in urls:
   r=process.delay(i)

When I call the Celery task like this, it creates a separate task for each URL. How can I execute the whole list of URLs in a single Celery task?

tasks.py

import requests

@app.task
def process(url):
    r = requests.get(url, allow_redirects=True)
    return r

You can pass the whole list of URLs to a single Celery task. This works, but it doesn't make much sense: you lose the benefit of parallelized outgoing requests, because inside one task each URL is fetched serially, one after the other. (If you still want parallel fetches, see the sketch after the tasks.py example below.)

views.py

urls=["https//:.....com", "https//:.....com", ...]
r = process.delay(urls)

tasks.py

import requests

@app.task
def process(urls):
    results = []
    for url in urls:
        r = requests.get(url, allow_redirects=True)
        # Response objects aren't JSON-serializable, so return something
        # serializable such as the status code (or r.text).
        results.append(r.status_code)
    return results
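If you want one call site but still want the requests to run in parallel, Celery's `group` primitive can fan out one sub-task per URL. Below is a minimal sketch, not part of the original answer: the broker/backend URLs and the `fetch`/`fetch_all` names are placeholder assumptions for illustration.

from celery import Celery, group
import requests

# Placeholder broker/backend; substitute your own Redis/RabbitMQ configuration.
app = Celery("tasks",
             broker="redis://localhost:6379/0",
             backend="redis://localhost:6379/0")

@app.task
def fetch(url):
    # Fetch a single URL and return a JSON-serializable summary.
    r = requests.get(url, allow_redirects=True)
    return {"url": url, "status": r.status_code}

def fetch_all(urls):
    # One sub-task per URL; available workers pick them up concurrently.
    job = group(fetch.s(url) for url in urls)
    result = job.apply_async()
    # Blocks until all sub-tasks finish; results come back in input order.
    return result.get()

The fan-out then happens inside Celery rather than in your view code, so you keep a single entry point while the individual requests are still processed in parallel by the workers.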
