
AttributeError: 'Spider' object has no attribute 'crawler' in Scrapy spider

In order to access settings from __init__ I had to add a from_crawler classmethod. Now it appears that some functionality of the Scrapy framework was lost: I am getting AttributeError: 'Code1Spider' object has no attribute 'crawler' when a URL fails and the spider tries to retry the request. The Scrapy version is 2.0.1, and the spider is running on Zyte Cloud.

What did I do wrong, and how do I fix it?

Here is my spider code:

import logging
from datetime import datetime

import scrapy

# load_gsheet and load_from_collection are my own helper functions (defined elsewhere in the project)


class Code1Spider(scrapy.Spider):
    name = 'cointelegraph_pr'
    allowed_domains = ['cointelegraph.com']
    start_urls = ['https://cointelegraph.com/press-releases']
    
    
    def __init__(self, settings):
        # Returns the settings values as a dict
        settings = settings.copy_to_dict()
        
        self.id = int(str(datetime.now().timestamp()).split('.')[0])          
        self.gs_id = settings.get('GS_ID')
        self.endpoint_url = settings.get('ENDPOINT_URL')
        self.zyte_api_key = settings.get('ZYTE_API_KEY')        
        self.zyte_project_id = settings.get('ZYTE_PROJECT_ID')        
        self.zyte_collection_name = self.name
        
        # Loads a list of stop words from a predefined Google Sheet
        self.denied = load_gsheet(self.gs_id)
        # Loads all URLs scraped in previous runs from Zyte collections
        self.scraped_urls = load_from_collection(self.zyte_project_id, self.zyte_collection_name, self.zyte_api_key)
        logging.info("###############################")
        logging.info("Number of previously scraped URLs = {}.".format(len(self.scraped_urls)))
        logging.info("")
        
        
    # We need this to pass settings into __init__. Otherwise settings would only be accessible after __init__ has run.
    # As per https://docs.scrapy.org/en/1.8/topics/settings.html#how-to-access-settings
    @classmethod
    def from_crawler(cls, crawler):
        settings = crawler.settings
        return cls(settings)

Here is the error:

Traceback (most recent call last):
  File "/usr/local/lib/python3.8/site-packages/twisted/internet/defer.py", line 1418, in _inlineCallbacks
    result = g.send(result)
  File "/usr/local/lib/python3.8/site-packages/scrapy/core/downloader/middleware.py", line 42, in process_request
    defer.returnValue((yield download_func(request=request, spider=spider)))
  File "/usr/local/lib/python3.8/site-packages/twisted/internet/defer.py", line 1362, in returnValue
    raise _DefGen_Return(val)
twisted.internet.defer._DefGen_Return: <504 https://cointelegraph.com/press-releases/the-launch-of-santa-browser-to-bring-in-the-next-200m-users-onto-web30>

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/usr/local/lib/python3.8/site-packages/twisted/internet/defer.py", line 1418, in _inlineCallbacks
    result = g.send(result)
  File "/usr/local/lib/python3.8/site-packages/scrapy/core/downloader/middleware.py", line 51, in process_response
    response = yield deferred_from_coro(method(request=request, response=response, spider=spider))
  File "/usr/local/lib/python3.8/site-packages/scrapy/downloadermiddlewares/retry.py", line 53, in process_response
    return self._retry(request, reason, spider) or response
  File "/usr/local/lib/python3.8/site-packages/scrapy/downloadermiddlewares/retry.py", line 69, in _retry
    stats = spider.crawler.stats
AttributeError: 'Code1Spider' object has no attribute 'crawler'

Everything else is a default Scrapy spider; there are no modifications to settings or middlewares. What did I do wrong, and how do I fix it?

That is because you are overriding the from_crawler method without assigning the crawler to the spider. The retry middleware reads spider.crawler.stats (see the last frame of your traceback), and the base Spider.from_crawler normally attaches the crawler via spider._set_crawler(crawler); your override skips that step.

Change your from_crawler method to the following:

@classmethod
def from_crawler(cls, crawler):
    settings = crawler.settings
    spider = cls(settings)
    spider._set_crawler(crawler)
    return spider
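
Alternatively, you can delegate to the base class implementation, which both instantiates the spider and attaches the crawler for you. The sketch below assumes Scrapy 2.x, where Spider.from_crawler(crawler, *args, **kwargs) forwards the extra arguments to __init__ and then calls spider._set_crawler(crawler):

@classmethod
def from_crawler(cls, crawler, *args, **kwargs):
    # The base implementation does cls(*args, **kwargs) followed by
    # spider._set_crawler(crawler), so spider.crawler (and its stats)
    # are available to the retry middleware again.
    return super().from_crawler(crawler, crawler.settings, *args, **kwargs)

Either way, the retry middleware will find spider.crawler.stats and the failed (504) requests will be retried as expected.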
