Django and Celery: scheduled task is running, but doesn't seem to do anything
I have created a Celery task in my Django app to scrape data and save it to the database:
#tasks.py
from celery.task.schedules import crontab
from celery.decorators import periodic_task
from celery.utils.log import get_task_logger
import datetime

from .models import Listing, City  #, ScrapingDate
from .scrapers import AIRBNB_scraper_from_jason

logger = get_task_logger(__name__)

@periodic_task(
    run_every=(crontab(minute='*/1')),
    name="scrape_capitals_listings",
    ignore_result=True
)
def scrape_capitals_listings():
    default_checkin = datetime.date.today()
    default_checkout = default_checkin + datetime.timedelta(days=30)
    counter = 0
    capitals = City.objects.filter(status='capital')
    for capital in capitals:
        print('Scraping listings for ' + capital.name)
        capitals_scraper = AIRBNB_scraper_from_jason(city=capital.name, checkin=default_checkin, checkout=default_checkout)
        capitals_listings = capitals_scraper.scraped_data
        for capital_listing in capitals_listings:
            listing = Listing()
            listing.name = capital_listing[0]
            listing.link = capital_listing[1]
            listing.price = capital_listing[2]
            listing.city = capital
            listing.country = capital.country
            listing.continent = capital.continent
            listing.date_added = datetime.datetime.now()
            listing.save()
        counter += 1
        if counter == 3:
            break
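As an aside for anyone reading later: the `celery.task.schedules` and `celery.decorators` import paths used above are from older Celery releases and were removed in Celery 4/5. A rough sketch of the equivalent setup in newer versions, using a `beat_schedule` entry instead of the `periodic_task` decorator (the `app` object and the `test_project/celery.py` module path are assumptions about the project layout, not from the question):

```python
# test_project/celery.py (hypothetical location of the Celery app object)
from celery import Celery
from celery.schedules import crontab  # modern import path for crontab

app = Celery('test_project')

# Register the schedule on the app instead of decorating the task.
# The 'task' key must match the task's registered name.
app.conf.beat_schedule = {
    'scrape_capitals_listings': {
        'task': 'scrape_capitals_listings',
        'schedule': crontab(minute='*/1'),  # every minute, same as the decorator
    },
}
```

The task itself would then be a plain `@app.task(name="scrape_capitals_listings", ignore_result=True)` function with the same body.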
In my views.py I call the task like this:
from .tasks import scrape_capitals_listings

def dashboards(request):
    scrape_capitals_listings.delay()  # I have also tried just "scrape_capitals_listings"
    capitals_chart = create_capitals_chart()
    context = {'capitals_chart': capitals_chart}
    return render(request, 'javascript/dashboards.html', context)
Then I run the server, the Redis broker, celery -A test_project -l info and celery -A test_project beat -l info in separate terminals.
I can see that the scheduled task is being sent by Celery, like this: [2018-08-17 07:16:00,018: INFO/MainProcess] Scheduler: Sending due task scrape_capitals_listings (scrape_capitals_listings), but nothing gets saved to the database.
The function code works fine when called normally from within the view, not as a Celery task, so I guess I am missing some important step here?
If you want to use periodic tasks you need to start the celerybeat service.
$ celerybeat
or
$ celeryd -B
http://docs.celeryproject.org/en/2.0-archived/getting-started/periodic-tasks.html#starting-celerybeat
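Note that the `celerybeat` and `celeryd -B` commands in the linked (archived) docs are from very old Celery releases. On Celery 4/5 the equivalent invocations go through the `celery` umbrella command; a sketch assuming the project is named `test_project` as in the question:

```shell
# Start a worker to actually execute tasks
# (the question's command omitted the "worker" subcommand):
celery -A test_project worker -l info

# Start the beat scheduler in a second terminal:
celery -A test_project beat -l info

# Or, for development only, embed beat in the worker process:
celery -A test_project worker -B -l info
```

Beat only *sends* due tasks to the broker; a running worker is still required to consume and execute them.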