
How to run and view Celery tasks in Django?

I am working on an email crawler for learning purposes, and I am having trouble understanding how to run the task and see which tasks are still running in the background.

In my views.py:

def home(request):
    form = SignUpForm(request.POST or None)
    if form.is_valid():
        save_it = form.save(commit=False)
        save_it.save()
        messages.success(request, "Working, please wait........")
        baseurl = form.cleaned_data['site']
        maxemails = form.cleaned_data['max_emails']
        maxurl = form.cleaned_data['max_links']
        startcraw.delay(baseurl, maxurl, maxemails)
        return HttpResponseRedirect('/done/')
    # form not valid
    return render_to_response("signup.html", locals(), context_instance=RequestContext(request))

In tasks.py I have:

from celery import task

from .craw import crawler

@task()
def startcraw(base, url, emails):
    # debug marker: if this file appears, the worker picked up the task
    with open('myfile', 'w') as f:
        f.write('hi there\n')
    result = crawler(base, url, emails)
    result.save()

I tried to debug by writing to a file from inside the task, but the file never appears.

How can I know whether the crawler is even running, and how can I save/pull the result into my database (SQLite)? Any help will be appreciated.

You need to use the Celery logger. Simple example:

from datetime import datetime

from celery.utils.log import get_task_logger

logger = get_task_logger(__name__)

@app.task
def add(x, y):
    logger.info('started adding function at time {0}'.format(datetime.now()))
    return x + y
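
To answer "how can I know if the crawler is even running": every `.delay()` call returns an `AsyncResult` whose id you can store (for example in the session) and poll later. A minimal sketch, assuming a result backend (e.g. `django-celery-results` or Redis) is configured for your Celery app; `task_state` is a hypothetical helper, not part of the question's code:

```python
from celery.result import AsyncResult

# In the view, keep the id that .delay() returns:
#   result = startcraw.delay(baseurl, maxurl, maxemails)
#   request.session['craw_task_id'] = result.id

def task_state(task_id):
    """Return the current state of a previously started task."""
    res = AsyncResult(task_id)
    # PENDING  -> not picked up yet (or unknown id)
    # STARTED  -> running (only reported when task_track_started is enabled)
    # SUCCESS / FAILURE -> the worker finished
    return res.state
```

You can also ask the workers directly from the command line with `celery -A proj inspect active`, which lists the tasks each worker is currently executing.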

Logging is described here: http://docs.celeryproject.org/en/latest/userguide/tasks.html#logging. Celery's docs are good, so most of what you need can be found there. Also, for a job like crawling you may want scheduled (periodic) tasks: http://celery.readthedocs.org/en/latest/userguide/periodic-tasks.html
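
If you go the periodic-task route, a crawl can be scheduled with Celery beat. A minimal config sketch, assuming a Celery app object, a Redis broker URL, and that the task is importable as `proj.tasks.startcraw`; the module path, broker URL, and argument values are placeholders, not taken from the question:

```python
from celery import Celery
from celery.schedules import crontab

app = Celery('proj', broker='redis://localhost:6379/0')  # assumed broker URL

app.conf.beat_schedule = {
    'crawl-every-hour': {
        'task': 'proj.tasks.startcraw',          # assumed module path
        'schedule': crontab(minute=0),           # top of every hour
        'args': ('http://example.com', 100, 50), # base, max_links, max_emails
    },
}
```

Run `celery -A proj beat` alongside the worker so the schedule is dispatched. Note that `app.conf.beat_schedule` is the Celery 4+ name; in the older Celery versions these docs target, the same setting is called `CELERYBEAT_SCHEDULE`.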

