
Celery task hangs on an HTTP request

I'm testing Celery tasks and have stumbled on an issue: if a task contains code that makes an HTTP request (through urllib.urlopen), it hangs. What could the reason be?

I tried to start with a minimal config using Flask. I used RabbitMQ and Redis as the broker and backend, but the result is the same.

File with the tasks (run_celery.py):

# ... import Celery, requests, and the Flask app ...

celery = Celery(
    app.import_name,
    backend=app.config['CELERY_BROKER_URL'],
    broker=app.config['CELERY_BROKER_URL']
)

@celery.task
def test_task(a):
    print(a)
    print(requests.get('http://google.com'))

I launched the worker like this: celery -A run_celery.celery worker -l debug

After this, I ran ipython and called the task:

from run_celery import test_task
test_task.apply_async(('sfas',))

The worker begins performing the task:

...
Received task: run_celery.test_task...
sfas
Starting new HTTP connection (1)...

And after this it hangs.

This behavior only occurs if the task contains a request. What did I do wrong?

I found the reason in my code and was very surprised O_o. I don't know why this happens, but the file with the tasks imports a Model, and executing that import initializes a MagentoAPI instance (https://github.com/bernieke/python-magento). If I comment out this initialization, requests in the Celery tasks work correctly.
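
A minimal sketch of one way to apply this workaround, assuming the MagentoAPI(host, port, api_user, api_key) constructor from python-magento; the host, port, and credentials below are placeholders. Instead of creating the client at module import time (the Celery worker runs those imports too when it loads the tasks module), it is created lazily on first use:

from magento import MagentoAPI

_magento = None

def get_magento():
    # Create the MagentoAPI instance on first use, inside the worker process,
    # instead of at import time of the tasks module.
    global _magento
    if _magento is None:
        # Placeholder connection details
        _magento = MagentoAPI('magento.example.com', 80, 'api_user', 'api_key')
    return _magento

With this in place, a task would call get_magento() only when it actually needs the API, so nothing connection-related runs while the worker imports run_celery.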
