
How To Turn Off Logging in Scrapy (Python)

I have created a spider using Scrapy but I cannot figure out how to turn off the default logging. From the documentation it appears that I should be able to turn it off by doing

        logging.basicConfig(level=logging.ERROR)

But this has no effect. From looking at the code for logging.basicConfig() I'm guessing this is because "the root logger has handlers configured", but perhaps I'm wrong about that. At any rate, can anyone explain what I need to do to get Scrapy to not output the usual

        2015-10-18 17:42:00 [scrapy] INFO: Scrapy 1.0.3 started (bot: EF)
        2015-10-18 17:42:00 [scrapy] INFO: Optional features available: ssl, http11, boto

etc.?

EDIT: As suggested by sirfz below, the line

        logging.getLogger('scrapy').setLevel(logging.WARNING)

can be used to set the logging level. However, it appears that you must do this in the __init__ method (or later) in your spider.

You can simply change the logging level for scrapy (or any other logger):

logging.getLogger('scrapy').setLevel(logging.WARNING)

This disables all log messages below the WARNING level.

To disable all scrapy log messages you can just set propagate to False:

logging.getLogger('scrapy').propagate = False

This prevents scrapy's log messages from propagating to the root logger (which prints to the console when configured using basicConfig()).
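The effect of propagate can also be demonstrated with the standard library alone; the list-collecting handler below is just a test stand-in for the console handler that basicConfig() installs on the root logger:

```python
import logging

class ListHandler(logging.Handler):
    """Collects every record it receives, standing in for the root
    logger's console handler so we can inspect what arrives."""
    def __init__(self):
        super().__init__()
        self.records = []
    def emit(self, record):
        self.records.append(record)

root_handler = ListHandler()
logging.getLogger().addHandler(root_handler)

scrapy_logger = logging.getLogger("scrapy")
scrapy_logger.warning("before")   # propagates up to the root handler

scrapy_logger.propagate = False   # the fix from the answer above
scrapy_logger.warning("after")    # no longer reaches the root handler

messages = [r.getMessage() for r in root_handler.records]
```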

You could add -s LOG_ENABLED=False as a parameter when launching your script. That should do the trick.

Note: for version 1.1 this changed slightly: -s LOG_ENABLED=0

logging.basicConfig(**kwargs)

This function does nothing if the root logger already has handlers configured for it.

Scrapy has handlers configured for it, so this will not work.
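This behaviour is easy to reproduce with the standard library alone. The NullHandler below stands in for the handlers Scrapy installs; note that since Python 3.8, basicConfig(force=True) will reconfigure the root logger even when handlers already exist, which is one way around this:

```python
import logging

root = logging.getLogger()

# Stand-in for a framework (like Scrapy) that has already installed
# a handler on the root logger before our code runs.
root.addHandler(logging.NullHandler())

# With handlers already present, basicConfig() returns without doing
# anything, so the requested ERROR level is silently ignored.
logging.basicConfig(level=logging.ERROR)
level_after_noop = root.level          # still the default WARNING

# Since Python 3.8, force=True removes the existing handlers first,
# so the configuration actually takes effect.
logging.basicConfig(level=logging.ERROR, force=True)
level_after_force = root.level         # now ERROR
```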

You can simply add --nolog as a parameter when launching your spider with the scrapy command. I am using scrapy v1.7.3. You can see more in the help output using:

scrapy --help

This might be helpful; the documentation mentions that you can add this to your settings file:

LOG_LEVEL = 'WARNING'
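For completeness, a minimal settings-file fragment; the module path is project-specific, and both LOG_LEVEL and LOG_ENABLED are documented Scrapy settings:

```python
# settings.py of a Scrapy project (exact module path depends on your project)

# Only emit WARNING and above from the whole crawl:
LOG_LEVEL = 'WARNING'

# Or switch Scrapy's logging off entirely (same effect as -s LOG_ENABLED=False):
# LOG_ENABLED = False
```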


 