How To Turn Off Logging in Scrapy (Python)
I have created a spider using Scrapy but I cannot figure out how to turn off the default logging. From the documentation it appears that I should be able to turn it off by doing
logging.basicConfig(level=logging.ERROR)
But this has no effect. From looking at the code for logging.basicConfig() I'm guessing this is because "the root logger has handlers configured", but perhaps I'm wrong about that. At any rate, can anyone explain what I need to do to get Scrapy to stop outputting the usual
2015-10-18 17:42:00 [scrapy] INFO: Scrapy 1.0.3 started (bot: EF)
2015-10-18 17:42:00 [scrapy] INFO: Optional features available: ssl, http11, boto
etc.?
EDIT: As suggested by sirfz below, the line
logging.getLogger('scrapy').setLevel(logging.WARNING)
can be used to set the logging level. However, it appears that you must do this in the __init__ method (or later) in your spider.
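The no-op behaviour of basicConfig() and the working setLevel() call can be reproduced with the standard library alone, without running Scrapy at all. A minimal sketch (the root handler added at the top stands in for the handler Scrapy installs at startup):

```python
import logging

# Simulate what Scrapy does at startup: it installs a handler on the
# root logger, so a later basicConfig() call is silently a no-op.
logging.getLogger().addHandler(logging.StreamHandler())

# This does nothing now -- the root logger already has a handler:
logging.basicConfig(level=logging.ERROR)

# Setting the level directly on the 'scrapy' logger does work:
logging.getLogger('scrapy').setLevel(logging.WARNING)

# INFO (20) is now below the logger's threshold, WARNING (30) is not:
print(logging.getLogger('scrapy').isEnabledFor(logging.INFO))     # False
print(logging.getLogger('scrapy').isEnabledFor(logging.WARNING))  # True
```

This is why the call has to happen in (or after) the spider's __init__: by then Scrapy's own logging setup has already run and will not reconfigure the level you set.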
You can simply change the logging level for scrapy (or any other logger):

logging.getLogger('scrapy').setLevel(logging.WARNING)

This disables all log messages below the WARNING level.

To disable all scrapy log messages you can just set propagate to False:

logging.getLogger('scrapy').propagate = False

This prevents scrapy's log messages from propagating to the root logger (which prints to the console when configured using basicConfig()).
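The propagate mechanism can be demonstrated with the standard library alone; here a list-collecting handler on the root logger stands in for the console handler that basicConfig() would install:

```python
import logging

# Attach a handler to the root logger, as basicConfig() would:
captured = []

class ListHandler(logging.Handler):
    def emit(self, record):
        captured.append(record.getMessage())

logging.getLogger().addHandler(ListHandler())

scrapy_logger = logging.getLogger('scrapy')
scrapy_logger.warning('reaches the root handler')

# Cut the link to the root logger: records no longer reach its handlers.
scrapy_logger.propagate = False
scrapy_logger.warning('no longer reaches the root handler')

print(captured)  # ['reaches the root handler']
```

Note that this silences the messages at the handler, not at the logger: the records are still created, they just stop travelling up the logger hierarchy.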
You could add -s LOG_ENABLED=False as a parameter when launching your script. That should do the trick.

Note: for version 1.1 this changed a little bit: -s LOG_ENABLED=0
logging.basicConfig(**kwargs)

This function does nothing if the root logger already has handlers configured for it.

Scrapy has handlers configured for it, so this will not work.
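This is a standard-library detail rather than anything Scrapy-specific, and on Python 3.8+ basicConfig() accepts force=True, which removes any existing root handlers before applying the new configuration. A minimal sketch of that escape hatch:

```python
import logging

# Simulate Scrapy having already attached a root handler:
logging.getLogger().addHandler(logging.StreamHandler())

# A plain basicConfig() would be a no-op here, but force=True
# (Python 3.8+) removes the existing root handlers first and then
# applies the requested configuration:
logging.basicConfig(level=logging.ERROR, force=True)

print(logging.getLogger().level == logging.ERROR)  # True
```

Whether this plays well with Scrapy's own logging setup depends on when it runs, so the setLevel()/propagate approaches above are usually the safer route.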
You can simply add --nolog as a parameter when launching your spider using the scrapy command.

I am using scrapy v1.7.3. You can see more in the help using the command:

scrapy --help