
How to pass parameters to pipelines.py in scrapy?

The problem I have is basically a parameter problem. I have a Scrapy project with five spiders. The spiders must extract information within a geographic area that is delimited by user-entered coordinates.

At the moment, the area polygon is defined in pipelines.py and is used by all spiders. The question is: how should I pass the coordinates as a parameter from a graphical interface (tkinter) to pipelines.py?

import googlemaps


class My_Pipeline(object):
    gmaps_key = googlemaps.Client(key='some_key')
    count = 0
    insidePolygon = 0
    polygon = [(-30.8377119, -64.6603737),
               (-31.5156532, -64.6699867),
               (-31.5507684, -64.4969520),
               (-31.4898937, -64.3349037),
               (-31.3503690, -64.3150073),
               (-31.3052314, -64.3352658),
               (-31.2659561, -64.3623695),
               (-30.9873456, -64.3994484),
               (-30.8412493, -64.4173012),
               (-30.8117674, -64.4763527),
               (-30.8117674, -64.5642433),
               (-30.8377119, -64.6603737)]

That polygon is the one that must be replaced by the coordinates entered by the user in the graphical interface.

Pass the parameters from the command line using scrapy crawl myspider -a params="1,3,5,6":

import scrapy


class MySpider(scrapy.Spider):
    name = 'myspider'

    def __init__(self, params='can be anything', *args, **kwargs):
        super(MySpider, self).__init__(*args, **kwargs)
        self.params = params

and then read it in your pipeline:

class My_Pipeline(object):
    def process_item(self, item, spider):
        params = spider.params  # read it like this
        return item
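
If you are launching the crawl from a tkinter callback rather than from the command line, you can pass the same argument programmatically through Scrapy's CrawlerProcess. Below is a minimal sketch; the "lat1,lng1;lat2,lng2;..." string format and the parse_polygon / run_spiders helpers are illustrative assumptions, not part of the original answer.

from scrapy.crawler import CrawlerProcess
from scrapy.utils.project import get_project_settings


def parse_polygon(params):
    # Assumed format: turn "lat1,lng1;lat2,lng2;..." into a list of (lat, lng) tuples.
    return [tuple(float(x) for x in pair.split(','))
            for pair in params.split(';')]


class My_Pipeline(object):
    def open_spider(self, spider):
        # Build the polygon once per run from the spider argument, whether it
        # came from -a params=... or from run_spiders() below.
        self.polygon = parse_polygon(spider.params)

    def process_item(self, item, spider):
        # Use self.polygon here to decide whether the item falls inside the
        # user-defined area before keeping or dropping it.
        return item


def run_spiders(params):
    # Call this from the tkinter button callback with the coordinate string.
    process = CrawlerProcess(get_project_settings())
    # Keyword arguments to crawl() are forwarded to the spider's __init__,
    # just like `scrapy crawl myspider -a params=...` on the command line.
    process.crawl('myspider', params=params)
    process.start()

Keyword arguments passed to process.crawl() reach the spider's __init__ the same way -a does, so the pipeline can read spider.params unchanged.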
