
Scrapy newbie: error when running "scrapy crawl dmoz" from the tutorial

I am a newbie to Python. I am running 64-bit Python 2.7.2 on 64-bit Windows 7. I followed the tutorial and installed Scrapy on my machine. Then I created a project, demoz. But when I run scrapy crawl demoz it shows an error.

d:\Scrapy workspace\tutorial>scrapy crawl dmoz
2013-08-29 16:10:45+0800 [scrapy] INFO: Scrapy 0.18.1 started (bot: tutorial)
2013-08-29 16:10:45+0800 [scrapy] DEBUG: Optional features available: ssl, http11
2013-08-29 16:10:45+0800 [scrapy] DEBUG: Overridden settings: {'NEWSPIDER_MODULE': 'tutorial.spiders', 'SPIDER_MODULES': ['tutorial.spiders'], 'BOT_NAME': 'tutorial'}
2013-08-29 16:10:45+0800 [scrapy] DEBUG: Enabled extensions: LogStats, TelnetConsole, CloseSpider, WebService, CoreStats, SpiderState
Traceback (most recent call last):
  File "C:\Python27\lib\runpy.py", line 162, in _run_module_as_main
    "__main__", fname, loader, pkg_name)
  File "C:\Python27\lib\runpy.py", line 72, in _run_code
    exec code in run_globals
  File "C:\Python27\lib\site-packages\scrapy-0.18.1-py2.7.egg\scrapy\cmdline.py", line 168, in <module>
    execute()
  File "C:\Python27\lib\site-packages\scrapy-0.18.1-py2.7.egg\scrapy\cmdline.py", line 143, in execute
    _run_print_help(parser, _run_command, cmd, args, opts)
  File "C:\Python27\lib\site-packages\scrapy-0.18.1-py2.7.egg\scrapy\cmdline.py", line 88, in _run_print_help
    func(*a, **kw)
  File "C:\Python27\lib\site-packages\scrapy-0.18.1-py2.7.egg\scrapy\cmdline.py", line 150, in _run_command
    cmd.run(args, opts)
  File "C:\Python27\lib\site-packages\scrapy-0.18.1-py2.7.egg\scrapy\commands\crawl.py", line 46, in run
    spider = self.crawler.spiders.create(spname, **opts.spargs)
  File "C:\Python27\lib\site-packages\scrapy-0.18.1-py2.7.egg\scrapy\command.py", line 34, in crawler
    self._crawler.configure()
  File "C:\Python27\lib\site-packages\scrapy-0.18.1-py2.7.egg\scrapy\crawler.py", line 44, in configure
    self.engine = ExecutionEngine(self, self._spider_closed)
  File "C:\Python27\lib\site-packages\scrapy-0.18.1-py2.7.egg\scrapy\core\engine.py", line 61, in __init__
    self.scheduler_cls = load_object(self.settings['SCHEDULER'])
  File "C:\Python27\lib\site-packages\scrapy-0.18.1-py2.7.egg\scrapy\utils\misc.py", line 40, in load_object
    raise ImportError, "Error loading object '%s': %s" % (path, e)
ImportError: Error loading object 'scrapy.core.scheduler.Scheduler': No module named queuelib

I guess there is something wrong with the installation. Can anybody help, please? Thanks in advance.
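The last line of the traceback names the missing piece: queuelib, a package that Scrapy 0.18's scheduler depends on. A quick sketch to confirm whether queuelib is importable from the same Python that runs Scrapy (if it is missing, it can typically be installed with `pip install queuelib` or `easy_install queuelib`):

```python
# Check whether the queuelib dependency named in the traceback is importable.
try:
    import queuelib  # Scrapy 0.18 loads this when building its scheduler
    print("queuelib found at:", queuelib.__file__)
except ImportError:
    # Typical fix: pip install queuelib
    print("queuelib is missing")
```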

Can you please verify the name of the spider in the project you have created — is it "demoz" or "dmoz"?

You specified "dmoz" as the spider name in the command:

d:\Scrapy workspace\tutorial>scrapy crawl dmoz
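The name that `scrapy crawl <name>` looks up is the `name` attribute defined inside the spider class, not the project name. A stripped-down sketch of that idea (a plain class here, standing in for the tutorial's `BaseSpider` subclass in Scrapy 0.18):

```python
# Stand-in for the tutorial spider class; only the `name` attribute
# matters for the `scrapy crawl` lookup.
class DmozSpider(object):
    name = "dmoz"  # <- this is what `scrapy crawl dmoz` matches
    allowed_domains = ["dmoz.org"]
    start_urls = [
        "http://www.dmoz.org/Computers/Programming/Languages/Python/Books/",
    ]

# `scrapy crawl demoz` would fail, because no spider class declares
# name == "demoz".
print(DmozSpider.name)
```

So if the project was created as "demoz" but the spider class declares `name = "dmoz"`, the command must use "dmoz".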

