
Scrapy shell doesn't work

I'm new to scrapy, so I wanted to try the scrapy shell for debugging and learning, but strangely the shell command doesn't work at all.

  • The site seems to be crawled successfully, but nothing more is printed afterwards. The program just sits there waiting, apparently dead, and I have to end it with ctrl-c.

Can you help me figure out what's wrong?

I'm using Anaconda + scrapy 1.0.3.

$ ping 135.251.157.2

Pinging 135.251.157.2 with 32 bytes of data:
Reply from 135.251.157.2: bytes=32 time=13ms TTL=56
Reply from 135.251.157.2: bytes=32 time=14ms TTL=56
Reply from 135.251.157.2: bytes=32 time=14ms TTL=56
Reply from 135.251.157.2: bytes=32 time=14ms TTL=56

Ping statistics for 135.251.157.2:
    Packets: Sent = 4, Received = 4, Lost = 0 (0% loss),
Approximate round trip times in milli-seconds:
    Minimum = 13ms, Maximum = 14ms, Average = 13ms


$ scrapy shell "http://135.251.157.2/"
2016-01-28 21:35:18 [scrapy] INFO: Scrapy 1.0.3 started (bot: demo)
2016-01-28 21:35:18 [scrapy] INFO: Optional features available: ssl, http11, boto
2016-01-28 21:35:18 [scrapy] INFO: Overridden settings: {'NEWSPIDER_MODULE': 'demo.spiders', 'SPIDER_MODULES': ['demo.spiders'], 'LOGSTATS_INTERVAL': 0, 'BOT_NAME': 'demo'}
2016-01-28 21:35:18 [scrapy] INFO: Enabled extensions: CloseSpider, TelnetConsole, CoreStats, SpiderState
2016-01-28 21:35:19 [scrapy] INFO: Enabled downloader middlewares: HttpAuthMiddleware, DownloadTimeoutMiddleware, UserAgentMiddleware, RetryMiddleware, DefaultHeadersMiddleware, MetaRefreshMiddleware, HttpCompressionMiddleware, RedirectMiddleware, CookiesMiddleware, HttpProxyMiddleware, ChunkedTransferMiddleware, DownloaderStats
2016-01-28 21:35:19 [scrapy] INFO: Enabled spider middlewares: HttpErrorMiddleware, OffsiteMiddleware, RefererMiddleware, UrlLengthMiddleware, DepthMiddleware
2016-01-28 21:35:19 [scrapy] INFO: Enabled item pipelines:
2016-01-28 21:35:19 [scrapy] DEBUG: Telnet console listening on 127.0.0.1:6023
2016-01-28 21:35:19 [scrapy] INFO: Spider opened
2016-01-28 21:35:24 [scrapy] DEBUG: Crawled (200) <GET http://135.251.157.2/> (referer: None)
2016-01-28 21:35:24 [root] DEBUG: Using default logger
2016-01-28 21:35:24 [root] DEBUG: Using default logger
ctrl-c

$ 

I'd like to close this thread, because I found the root cause is related to the terminal being used. It doesn't work when I run it from Git Bash, but it works fine from the Anaconda Prompt.

[Anaconda2] D:\SVN\tools\Spider\demo>scrapy shell "http://135.251.157.2/"
2016-01-29 13:40:33 [scrapy] INFO: Scrapy 1.0.3 started (bot: demo)
2016-01-29 13:40:33 [scrapy] INFO: Optional features available: ssl, http11, boto
2016-01-29 13:40:33 [scrapy] INFO: Overridden settings: {'NEWSPIDER_MODULE': 'demo.spiders', 'SPIDER_MODULES': ['demo.spiders'], 'LOGSTATS_INTERVAL': 0, 'BOT_NAME': 'demo'}
2016-01-29 13:40:33 [scrapy] INFO: Enabled extensions: CloseSpider, TelnetConsole, CoreStats, SpiderState
2016-01-29 13:40:33 [scrapy] INFO: Enabled downloader middlewares: HttpAuthMiddleware, DownloadTimeoutMiddleware, UserAgentMiddleware, RetryMiddleware, DefaultHeadersMiddleware, MetaRefreshMiddleware, HttpCompressionMiddleware, RedirectMiddleware, CookiesMiddleware, HttpProxyMiddleware, ChunkedTransferMiddleware, DownloaderStats
2016-01-29 13:40:33 [scrapy] INFO: Enabled spider middlewares: HttpErrorMiddleware, OffsiteMiddleware, RefererMiddleware, UrlLengthMiddleware, DepthMiddleware
2016-01-29 13:40:33 [scrapy] INFO: Enabled item pipelines:
2016-01-29 13:40:33 [scrapy] DEBUG: Telnet console listening on 127.0.0.1:6023
2016-01-29 13:40:33 [scrapy] INFO: Spider opened
2016-01-29 13:40:41 [scrapy] DEBUG: Crawled (200) <GET http://135.251.157.2/> (referer: None)
[s] Available Scrapy objects:
[s]   crawler    <scrapy.crawler.Crawler object at 0x0136B290>
[s]   item       {}
[s]   request    <GET http://135.251.157.2/>
[s]   response   <200 http://135.251.157.2/>
[s]   settings   <scrapy.settings.Settings object at 0x034204B0>
[s]   spider     <DefaultSpider 'default' at 0x3e3c6d0>
[s] Useful shortcuts:
[s]   shelp()           Shell help (print this help)
[s]   fetch(req_or_url) Fetch request (or URL) and update local objects
[s]   view(response)    View response in a browser
2016-01-29 13:40:41 [root] DEBUG: Using default logger
2016-01-29 13:40:41 [root] DEBUG: Using default logger

In [1]:
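A plausible explanation for the Git Bash behaviour (my assumption; the thread only reports the symptom) is that Git Bash's mintty terminal is not recognized as an interactive console by a native Windows Python, so the Scrapy/IPython shell downloads the page but never draws its `In [1]:` prompt. A minimal stdlib check you can run in each terminal to compare:

```python
import sys

def has_interactive_stdin() -> bool:
    """Return True when stdin is attached to a real console/tty.

    Assumption: under Git Bash's mintty, a native Windows Python often
    reports False here, which is consistent with prompt-based tools such
    as the Scrapy shell appearing to hang after the crawl finishes.
    """
    return sys.stdin.isatty()

if __name__ == "__main__":
    # Run this in both Git Bash and Anaconda Prompt and compare.
    print("interactive stdin:", has_interactive_stdin())
```

If this prints `False` under Git Bash but `True` under Anaconda Prompt, that points at the terminal rather than Scrapy itself. In that case, wrapping the command with `winpty` (shipped with recent Git for Windows), e.g. `winpty scrapy shell "http://135.251.157.2/"`, is a commonly suggested workaround, though I haven't verified it for this exact setup.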

