
Scrapy tutorial error: “Running 'Scrapy Crawl' with more than one spider is no longer supported”

Windows 10, Command Line Terminal, Python 3.6.3 |Anaconda, Inc.| (default, Oct 15 2017, 03:27:45) [MSC v.1900 64 bit (AMD64)] on win32

New to Python & following tutorial instructions here: https://docs.scrapy.org/en/latest/intro/tutorial.html

On the step where I run my first crawl, I get the following error. I've only created the one file, quotes_spider.py, as the tutorial suggests. I've also tried changing the directory to c:\Anaconda\projects\tutorial\startproject and c:\Anaconda\projects\tutorial\startproject\spiders, but I get the same error.

c:\Anaconda\projects\tutorial>c:\Anaconda\Scripts\scrapy.exe -scrapy crawl quotes  

2017-11-09 16:08:26 [scrapy.utils.log] INFO: Scrapy 1.4.0 started (bot: startproject)
2017-11-09 16:08:26 [scrapy.utils.log] INFO: Overridden settings: {'BOT_NAME': 'startproject', 'NEWSPIDER_MODULE': 'startproject.spiders', 'ROBOTSTXT_OBEY': True, 'SPIDER_MODULES': ['startproject.spiders']}
Usage
=====
  scrapy crawl [options] <spider>

scrapy: error: running 'scrapy crawl' with more than one spider is no longer supported

What's wrong

....scrapy.exe -scrapy crawl quotes

What it should be

....scrapy.exe crawl quotes
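
In other words, keep the command line from the question but drop the stray -scrapy token, and run it from the project directory that contains scrapy.cfg (in the question that is c:\Anaconda\projects\tutorial):

c:\Anaconda\projects\tutorial>c:\Anaconda\Scripts\scrapy.exe crawl quotes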

Why

These are all of Scrapy's available commands; there is no -scrapy option: (screenshot of the scrapy help output listing the available commands)
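
If the screenshot does not load, you can print the same list yourself: running the scrapy executable with -h (or with no command at all) shows every available sub-command and its options, and none of them is -scrapy:

c:\Anaconda\projects\tutorial>c:\Anaconda\Scripts\scrapy.exe -h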

Suggestions

Scrapy Official Tutorial PDF Version
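
For reference, the quotes_spider.py that the linked tutorial has you create (the file defining the quotes spider being crawled) looks roughly like the sketch below, reproduced from the tutorial and assuming it was copied unchanged:

import scrapy


class QuotesSpider(scrapy.Spider):
    name = "quotes"  # the name used in "scrapy crawl quotes"

    def start_requests(self):
        # pages of quotes.toscrape.com that the tutorial downloads
        urls = [
            'http://quotes.toscrape.com/page/1/',
            'http://quotes.toscrape.com/page/2/',
        ]
        for url in urls:
            yield scrapy.Request(url=url, callback=self.parse)

    def parse(self, response):
        # save each downloaded page as quotes-<n>.html in the working directory
        page = response.url.split("/")[-2]
        filename = 'quotes-%s.html' % page
        with open(filename, 'wb') as f:
            f.write(response.body)
        self.log('Saved file %s' % filename)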
