
Passing arguments to process.crawl in Scrapy python

I would like to get the same result as this command line: scrapy crawl linkedin_anonymous -a first=James -a last=Bond -o output.json

My script is as follows:

import scrapy
from linkedin_anonymous_spider import LinkedInAnonymousSpider
from scrapy.crawler import CrawlerProcess
from scrapy.utils.project import get_project_settings

spider = LinkedInAnonymousSpider(None, "James", "Bond")
process = CrawlerProcess(get_project_settings())
process.crawl(spider) ## <-------------- (1)
process.start()

I found out that process.crawl() in (1) is creating another LinkedInAnonymousSpider in which first and last are None (printed in (2)). If so, there is no point in creating the spider object, so how is it possible to pass the arguments first and last to process.crawl()?

linkedin_anonymous:

from logging import INFO

import scrapy

class LinkedInAnonymousSpider(scrapy.Spider):
    name = "linkedin_anonymous"
    allowed_domains = ["linkedin.com"]
    start_urls = []

    base_url = "https://www.linkedin.com/pub/dir/?first=%s&last=%s&search=Search"

    def __init__(self, input=None, first=None, last=None):
        self.input = input  # source file name
        self.first = first
        self.last = last

    def start_requests(self):
        print(self.first)  ## <------------- (2)
        if self.first and self.last:  # taking input from command line parameters
            url = self.base_url % (self.first, self.last)
            yield self.make_requests_from_url(url)

    def parse(self, response):
        ...

Pass the spider arguments in the call to process.crawl:

process.crawl(spider, input='inputargument', first='James', last='Bond')
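For completeness, a minimal sketch of the full script under that approach, assuming LinkedInAnonymousSpider is importable as in the question. The spider class itself (not an instance) is passed to process.crawl, and the keyword arguments are forwarded to the spider's __init__:

from scrapy.crawler import CrawlerProcess
from scrapy.utils.project import get_project_settings

from linkedin_anonymous_spider import LinkedInAnonymousSpider

# CrawlerProcess instantiates the spider class itself and forwards
# the keyword arguments to its __init__.
process = CrawlerProcess(get_project_settings())
process.crawl(LinkedInAnonymousSpider, input=None, first='James', last='Bond')
process.start()

To reproduce the -o output.json part of the command line, the feed export settings can be set on the settings object before creating the process (FEED_URI/FEED_FORMAT in older Scrapy, or the FEEDS dictionary in Scrapy 2.1+).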

You can do it the easy way:

from scrapy import cmdline

cmdline.execute("scrapy crawl linkedin_anonymous -a first=James -a last=Bond -o output.json".split())
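One thing to note about this approach: cmdline.execute runs the command in the current process and calls sys.exit when the crawl finishes, so any code placed after it will not run. The command can also be passed as a list, which avoids shell-style splitting; the sketch below simply mirrors the original command:

from scrapy import cmdline

# execute() accepts the argv list directly and does not return,
# so it should be the last statement in the script.
cmdline.execute([
    'scrapy', 'crawl', 'linkedin_anonymous',
    '-a', 'first=James',
    '-a', 'last=Bond',
    '-o', 'output.json',
])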

If you have Scrapyd and you want to schedule the spider, do this:

curl http://localhost:6800/schedule.json -d project=projectname -d spider=spidername -d first='James' -d last='Bond'
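The same Scrapyd call can also be made from Python, for example with the requests library; a minimal sketch, assuming a Scrapyd instance running on localhost:6800 and placeholder project/spider names:

import requests

# Extra form fields beyond project/spider are passed through to the
# spider as if they were -a arguments.
response = requests.post(
    'http://localhost:6800/schedule.json',
    data={
        'project': 'projectname',
        'spider': 'linkedin_anonymous',
        'first': 'James',
        'last': 'Bond',
    },
)
print(response.json())  # expected: {'status': 'ok', 'jobid': '...'}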
