Python Scrapy - Execute code after spider exits

I have not been able to find an answer to this question: how can I execute Python code after a Scrapy spider exits?

I tried calling `self.my_function()` inside the callback that parses the response (`def parse_item(self, response):`) and then defined `my_function()`, but the problem is that it still runs inside the spider's loop, once per response. What I want is to run code once, outside the spider's loop, on the data gathered during the crawl. Thanks.

Use the `closed` method of the `scrapy.Spider` class, as follows:

import scrapy

class MySpider(scrapy.Spider):
    name = "myspider"
    # data collected across parse() calls
    spider_attr = []

    def parse(self, response):
        # do your scraping logic here, e.g.:
        # page_text = response.xpath('//text()').extract()
        self.spider_attr.append(response.url)

    def closed(self, reason):
        # called once, when the spider finishes
        # do something with the collected data
        for i in self.spider_attr:
            print(i)
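To see why this solves the "inside the loop" problem, here is a toy sketch in plain Python (no Scrapy required, and `ToySpider`/`run` are made-up names for illustration): `parse` is invoked once per response while the crawl loop runs, whereas `closed` fires exactly once after the loop ends, which is the right place for post-processing.

```python
class ToySpider:
    def __init__(self):
        self.collected = []  # plays the role of spider_attr above

    def parse(self, response):
        # called once per response, while the crawl loop is running
        self.collected.append(response)

    def closed(self, reason):
        # called exactly once, after the last parse() call
        return [item.upper() for item in self.collected]

def run(spider, responses):
    # stand-in for the crawl loop that Scrapy drives internally
    for r in responses:
        spider.parse(r)
    return spider.closed("finished")

spider = ToySpider()
result = run(spider, ["page one", "page two"])
print(result)  # ['PAGE ONE', 'PAGE TWO']
```

In real Scrapy you never write the `run` loop yourself; the framework calls `closed(reason)` for you when the crawl terminates (with `reason` set to e.g. `"finished"`).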
