
Scrapy newbie: tutorial. There is an error when running scrapy crawl dmoz

I am new to Python. I am running 64-bit Python 2.7.2 on 64-bit Windows 7. I followed the tutorial and installed Scrapy on my machine. Then I created a project, demoz. But when I enter scrapy crawl demoz, it shows an error:

d:\Scrapy workspace\tutorial>scrapy crawl dmoz
2013-08-29 16:10:45+0800 [scrapy] INFO: Scrapy 0.18.1 started (bot: tutorial)
2013-08-29 16:10:45+0800 [scrapy] DEBUG: Optional features available: ssl, http11
2013-08-29 16:10:45+0800 [scrapy] DEBUG: Overridden settings: {'NEWSPIDER_MODULE': 'tutorial.spiders', 'SPIDER_MODULES': ['tutorial.spiders'], 'BOT_NAME': 'tutorial'}
2013-08-29 16:10:45+0800 [scrapy] DEBUG: Enabled extensions: LogStats, TelnetConsole, CloseSpider, WebService, CoreStats, SpiderState
Traceback (most recent call last):
  File "C:\Python27\lib\runpy.py", line 162, in _run_module_as_main
    "__main__", fname, loader, pkg_name)
  File "C:\Python27\lib\runpy.py", line 72, in _run_code
    exec code in run_globals
  File "C:\Python27\lib\site-packages\scrapy-0.18.1-py2.7.egg\scrapy\cmdline.py", line 168, in <module>
    execute()
  File "C:\Python27\lib\site-packages\scrapy-0.18.1-py2.7.egg\scrapy\cmdline.py", line 143, in execute
    _run_print_help(parser, _run_command, cmd, args, opts)
  File "C:\Python27\lib\site-packages\scrapy-0.18.1-py2.7.egg\scrapy\cmdline.py", line 88, in _run_print_help
    func(*a, **kw)
  File "C:\Python27\lib\site-packages\scrapy-0.18.1-py2.7.egg\scrapy\cmdline.py", line 150, in _run_command
    cmd.run(args, opts)
  File "C:\Python27\lib\site-packages\scrapy-0.18.1-py2.7.egg\scrapy\commands\crawl.py", line 46, in run
    spider = self.crawler.spiders.create(spname, **opts.spargs)
  File "C:\Python27\lib\site-packages\scrapy-0.18.1-py2.7.egg\scrapy\command.py", line 34, in crawler
    self._crawler.configure()
  File "C:\Python27\lib\site-packages\scrapy-0.18.1-py2.7.egg\scrapy\crawler.py", line 44, in configure
    self.engine = ExecutionEngine(self, self._spider_closed)
  File "C:\Python27\lib\site-packages\scrapy-0.18.1-py2.7.egg\scrapy\core\engine.py", line 61, in __init__
    self.scheduler_cls = load_object(self.settings['SCHEDULER'])
  File "C:\Python27\lib\site-packages\scrapy-0.18.1-py2.7.egg\scrapy\utils\misc.py", line 40, in load_object
    raise ImportError, "Error loading object '%s': %s" % (path, e)
ImportError: Error loading object 'scrapy.core.scheduler.Scheduler': No module named queuelib

I think something is wrong with the installation. Can anyone help? Thanks in advance.
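The traceback itself points at a missing dependency rather than the spider code: Scrapy 0.18's default scheduler fails to import because the queuelib package cannot be found. As a minimal sketch (not part of the original post), you can check whether queuelib is importable from the same Python installation and install it if it is absent:

# Quick dependency check (illustrative sketch, assuming Scrapy 0.18 on Python 2.7).
try:
    import queuelib  # Scrapy 0.18's scheduler queues depend on this package
    print("queuelib found at:", queuelib.__file__)
except ImportError:
    # If this branch runs, install the package (e.g. pip install queuelib)
    # and then retry `scrapy crawl dmoz`.
    print("queuelib is missing; install it with: pip install queuelib")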

Can you verify the spider's name inside the project you created? Is it "demoz" or "dmoz"?

You specified "dmoz" as the spider name in the command:

d:\Scrapy workspace\tutorial>scrapy crawl dmoz
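For reference, here is a minimal sketch of the spider from the Scrapy 0.18 tutorial (file name and URLs follow the tutorial and may differ in your project): the name attribute is what `scrapy crawl <name>` must match.

# Sketch of tutorial/spiders/dmoz_spider.py as in the Scrapy 0.18 tutorial.
from scrapy.spider import BaseSpider

class DmozSpider(BaseSpider):
    name = "dmoz"                      # `scrapy crawl dmoz` looks up this value
    allowed_domains = ["dmoz.org"]
    start_urls = [
        "http://www.dmoz.org/Computers/Programming/Languages/Python/Books/",
        "http://www.dmoz.org/Computers/Programming/Languages/Python/Resources/",
    ]

    def parse(self, response):
        # Save each fetched page body to a local file named after the URL segment.
        filename = response.url.split("/")[-2]
        open(filename, "wb").write(response.body)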

