
Error when running Scrapy Tutorial on Windows 7 running Python 2.7.8

Sorry if this is a duplicate; I searched. I installed Python, Scrapy, OpenSSL, and lxml following the Scrapy installation instructions. I worked through the tutorial until I got to the following:

PS C:\python27\tutorial> scrapy crawl dmoz

2014-09-18 15:29:14-0700 [scrapy] INFO: Scrapy 0.24.4 started (bot: tutorial)
2014-09-18 15:29:14-0700 [scrapy] INFO: Optional features available: ssl, http11
2014-09-18 15:29:14-0700 [scrapy] INFO: Overridden settings: {'NEWSPIDER_MODULE': 'tutorial.spiders', 'SPIDER_MODULES': ['tutorial.spiders'], 'BOT_NAME': 'tutorial'}
2014-09-18 15:29:14-0700 [scrapy] INFO: Enabled extensions: LogStats, TelnetConsole, CloseSpider, WebService, CoreStats, SpiderState
Traceback (most recent call last):
  File "C:\Python27\lib\runpy.py", line 162, in _run_module_as_main
    "__main__", fname, loader, pkg_name)
  File "C:\Python27\lib\runpy.py", line 72, in _run_code
    exec code in run_globals
  File "C:\Python27\Scripts\scrapy.exe\__main__.py", line 9, in <module>
  File "C:\Python27\lib\site-packages\scrapy\cmdline.py", line 143, in execute
    _run_print_help(parser, _run_command, cmd, args, opts)
  File "C:\Python27\lib\site-packages\scrapy\cmdline.py", line 89, in _run_print_help
    func(*a, **kw)
  File "C:\Python27\lib\site-packages\scrapy\cmdline.py", line 150, in _run_command
    cmd.run(args, opts)
  File "C:\Python27\lib\site-packages\scrapy\commands\crawl.py", line 60, in run
    self.crawler_process.start()
  File "C:\Python27\lib\site-packages\scrapy\crawler.py", line 92, in start
    if self.start_crawling():
  File "C:\Python27\lib\site-packages\scrapy\crawler.py", line 124, in start_crawling
    return self._start_crawler() is not None
  File "C:\Python27\lib\site-packages\scrapy\crawler.py", line 139, in _start_crawler
    crawler.configure()
  File "C:\Python27\lib\site-packages\scrapy\crawler.py", line 47, in configure
    self.engine = ExecutionEngine(self, self._spider_closed)
  File "C:\Python27\lib\site-packages\scrapy\core\engine.py", line 64, in __init__
    self.downloader = downloader_cls(crawler)
  File "C:\Python27\lib\site-packages\scrapy\core\downloader\__init__.py", line 73, in __init__
    self.handlers = DownloadHandlers(crawler)
  File "C:\Python27\lib\site-packages\scrapy\core\downloader\handlers\__init__.py", line 22, in __init__
    cls = load_object(clspath)
  File "C:\Python27\lib\site-packages\scrapy\utils\misc.py", line 42, in load_object
    raise ImportError("Error loading object '%s': %s" % (path, e))
ImportError: Error loading object 'scrapy.core.downloader.handlers.s3.S3DownloadHandler': No module named win32api

Any ideas?

Figured it out.

Install pywin32.
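As a rough sketch of how that might look (assuming pip is on your PATH; on older Python 2.7 setups the pywin32 installer executable matching your Python version also works), install the package and then confirm that win32api imports:

PS C:\python27> pip install pywin32
PS C:\python27> python -c "import win32api; print win32api.__file__"

Once win32api imports without error, `scrapy crawl dmoz` should get past the ImportError shown above.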

