
scrapy newbie: tutorial. there is an error when running scrapy crawl dmoz

I am new to Python. I am running 64-bit Python 2.7.2 on 64-bit Windows 7. I followed the tutorial and installed Scrapy on my machine. Then I created a project, demoz. But when I type scrapy crawl demoz, it shows an error:

d:\Scrapy workspace\tutorial>scrapy crawl dmoz
2013-08-29 16:10:45+0800 [scrapy] INFO: Scrapy 0.18.1 started (bot: tutorial)
2013-08-29 16:10:45+0800 [scrapy] DEBUG: Optional features available: ssl, http11
2013-08-29 16:10:45+0800 [scrapy] DEBUG: Overridden settings: {'NEWSPIDER_MODULE': 'tutorial.spiders', 'SPIDER_MODULES': ['tutorial.spiders'], 'BOT_NAME': 'tutorial'}
2013-08-29 16:10:45+0800 [scrapy] DEBUG: Enabled extensions: LogStats, TelnetConsole, CloseSpider, WebService, CoreStats, SpiderState
Traceback (most recent call last):
  File "C:\Python27\lib\runpy.py", line 162, in _run_module_as_main
    "__main__", fname, loader, pkg_name)
  File "C:\Python27\lib\runpy.py", line 72, in _run_code
    exec code in run_globals
  File "C:\Python27\lib\site-packages\scrapy-0.18.1-py2.7.egg\scrapy\cmdline.py", line 168, in <module>
    execute()
  File "C:\Python27\lib\site-packages\scrapy-0.18.1-py2.7.egg\scrapy\cmdline.py", line 143, in execute
    _run_print_help(parser, _run_command, cmd, args, opts)
  File "C:\Python27\lib\site-packages\scrapy-0.18.1-py2.7.egg\scrapy\cmdline.py", line 88, in _run_print_help
    func(*a, **kw)
  File "C:\Python27\lib\site-packages\scrapy-0.18.1-py2.7.egg\scrapy\cmdline.py", line 150, in _run_command
    cmd.run(args, opts)
  File "C:\Python27\lib\site-packages\scrapy-0.18.1-py2.7.egg\scrapy\commands\crawl.py", line 46, in run
    spider = self.crawler.spiders.create(spname, **opts.spargs)
  File "C:\Python27\lib\site-packages\scrapy-0.18.1-py2.7.egg\scrapy\command.py", line 34, in crawler
    self._crawler.configure()
  File "C:\Python27\lib\site-packages\scrapy-0.18.1-py2.7.egg\scrapy\crawler.py", line 44, in configure
    self.engine = ExecutionEngine(self, self._spider_closed)
  File "C:\Python27\lib\site-packages\scrapy-0.18.1-py2.7.egg\scrapy\core\engine.py", line 61, in __init__
    self.scheduler_cls = load_object(self.settings['SCHEDULER'])
  File "C:\Python27\lib\site-packages\scrapy-0.18.1-py2.7.egg\scrapy\utils\misc.py", line 40, in load_object
    raise ImportError, "Error loading object '%s': %s" % (path, e)
ImportError: Error loading object 'scrapy.core.scheduler.Scheduler': No module named queuelib

I think there is something wrong with the installation. Can anyone help? Thanks in advance.
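For what it's worth, the last line of the traceback names a missing queuelib module, which Scrapy 0.18's default scheduler depends on. A minimal diagnostic sketch, assuming you just want to confirm whether queuelib is importable from the same interpreter that runs Scrapy (the file name is hypothetical):

# check_queuelib.py -- hypothetical check: can the interpreter that runs Scrapy
# import the queuelib package named in the ImportError above?
try:
    import queuelib  # needed by scrapy.core.scheduler.Scheduler in Scrapy 0.18
except ImportError:
    print("queuelib is not installed for this Python; installing it "
          "(for example with pip) should clear the ImportError")
else:
    print("queuelib imports fine; the problem is elsewhere")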

Can you verify the name of the spider in the project you created? Is it "demoz" or "dmoz"?

You specified "dmoz" as the spider name in the command:

d:\Scrapy workspace\tutorial>scrapy crawl dmoz
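For context, scrapy crawl looks the spider up by its name attribute, so the string passed on the command line has to match what the spider class declares. A rough sketch of the dmoz spider from the 0.18-era tutorial (file name and contents assumed from the tutorial, not taken from the question):

# tutorial/spiders/dmoz_spider.py -- sketch of the tutorial's spider; the value
# of `name` is what must be passed to `scrapy crawl <name>`.
from scrapy.spider import BaseSpider

class DmozSpider(BaseSpider):
    name = "dmoz"                       # so the command is: scrapy crawl dmoz
    allowed_domains = ["dmoz.org"]
    start_urls = [
        "http://www.dmoz.org/Computers/Programming/Languages/Python/Books/",
    ]

    def parse(self, response):
        # The tutorial simply saves each fetched page body to a local file.
        filename = response.url.split("/")[-2]
        open(filename, "wb").write(response.body)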

