
Calling Scrapy Spider from Django

I have a project with a Django and a Scrapy folder in the same workspace:

my_project/
    django_project/
        django_project/
            settings.py
        app1/
        app2/
        manage.py
        ...
    scrapy_project/
        scrapy_project/
            settings.py
        scrapy.cfg
        ...

I've already connected Scrapy with my Django app1 model, so every time I run my spider it stores the collected data in my PostgreSQL db. This is how my Scrapy project accesses the Django model:

# in my_project/scrapy_project/scrapy_project/settings.py
import os
import sys

import django

# Make the Django project importable; resolve the path relative to this
# file instead of relying on the current working directory.
sys.path.append(os.path.join(os.path.dirname(os.path.abspath(__file__)), '..', '..', 'django_project'))
os.environ['DJANGO_SETTINGS_MODULE'] = 'django_project.settings'
django.setup()
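
For context, a pipeline along these lines is enough to persist items through the Django ORM once django.setup() has run; app1.models.ScrapedItem and the pipeline name are hypothetical stand-ins for whatever the real project uses:

# A minimal sketch; app1.models.ScrapedItem is an assumed model name.
from app1.models import ScrapedItem  # hypothetical Django model

class DjangoWriterPipeline:
    def process_item(self, item, spider):
        # Persist each scraped item through the Django ORM.
        ScrapedItem.objects.create(**dict(item))
        return item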

Everything works great when I call the spider from the command line, but when I call the spider as a script from a Django view or a Celery task, for example:

from scrapy.crawler import CrawlerProcess
from scrapy.utils.project import get_project_settings

process = CrawlerProcess(get_project_settings())
process.crawl('spider_name')  # look the spider up by its name string
process.start()

I get an error:

KeyError: 'Spider not found: spider_name'

I think I'm supposed to tell Django where Scrapy is located (as I've done in the Scrapy settings), but I don't know how. To be honest, I'm not even sure that the folder structure I've designed for this project is the right choice.
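
A likely cause: get_project_settings() locates the Scrapy settings through the SCRAPY_SETTINGS_MODULE environment variable, falling back to searching for scrapy.cfg from the current working directory. Run from a Django view or Celery worker, neither is visible, so Scrapy ends up with default settings, an empty SPIDER_MODULES, and a failed by-name lookup. A minimal sketch of one fix, with /path/to/my_project/scrapy_project as a placeholder for wherever scrapy.cfg actually lives:

import os
import sys

from scrapy.crawler import CrawlerProcess
from scrapy.utils.project import get_project_settings

# Make the Scrapy project importable and point Scrapy at its settings
# before get_project_settings() is called.
sys.path.append('/path/to/my_project/scrapy_project')  # hypothetical path
os.environ.setdefault('SCRAPY_SETTINGS_MODULE', 'scrapy_project.settings')

process = CrawlerProcess(get_project_settings())
process.crawl('spider_name')  # the by-name lookup now sees SPIDER_MODULES
process.start()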

Follow the example from the Scrapy docs:

from my_project.scrapy_project.spiders import MySpider
...
process.crawl(MySpider)  # pass the class instead of the 'spider_name' string
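
Putting the two pieces together, a runnable sketch under the same assumptions as above (the spider module path and MySpider are placeholders):

import os
import sys

from scrapy.crawler import CrawlerProcess
from scrapy.utils.project import get_project_settings

sys.path.append('/path/to/my_project/scrapy_project')  # hypothetical path to the folder containing scrapy.cfg
os.environ.setdefault('SCRAPY_SETTINGS_MODULE', 'scrapy_project.settings')

from scrapy_project.spiders.my_spider import MySpider  # hypothetical spider module

process = CrawlerProcess(get_project_settings())
process.crawl(MySpider)  # passing the class avoids the name lookup entirely
process.start()          # blocks until the crawl finishes

One caveat: process.start() runs the Twisted reactor, which cannot be restarted in the same process, so for repeated crawls from a long-lived Django or Celery worker it is usually safer to launch the spider in a subprocess (or via scrapyd) instead.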
