Use Loguru logging library to log requests made in an imported file

So I have a program written in a Python file (main.py) that uses classes in an API wrapper file (bluerev.py). I want to use the loguru logger in main.py to collect all exceptions from the program plus all requests made in the API wrapper. The logging setup in the bluerev.py API wrapper looks like this:

import logging

# Logging setup
logger = logging.getLogger(__name__)
logger.addHandler(logging.NullHandler())


class BluerevApiRequestHandler:

    def __init__(self):
        self.date_str_format = "%Y-%m-%dT%H:%M:%S.%f"

    @staticmethod
    def setup_logger_for_requests():
        """
        Sets up the requests library to log api requests
        """
        logger.setLevel(logging.DEBUG)
        requests_log = logging.getLogger("requests.packages.urllib3")
        requests_log.setLevel(logging.DEBUG)
        requests_log.propagate = True

The main.py logging code looks like this:

from blurev import *
from loguru import logger
#more imports and code

@logger.catch
def main():
    # associated file and credential locations
    gmail_creds_file = "gmail_creds.json"
    revu_creds_file = r"revu_credentials.json"
    token_file = r"revu_refresh_token.json"
    files_location = os.path.join(os.getcwd(), "rev_dist_files")
    review_files_location = os.path.join(os.getcwd(), "files_to_distribute")

    # Set up Logging
    logging_file = r"plan_review_distributor.log"
    #logging_format_str = "%(levelname)s - %(asctime)-15s - %(filename)s - line %(lineno)d --> %(message)s"
    logger.add(os.path.join(files_location, logging_file),  level="WARNING")
    logger.add(os.path.join(files_location, logging_file), filter="blurev", level="DEBUG")

    #call the requests logging function here in main()
    request_handler = BluerevApiRequestHandler()
    request_handler.setup_logger_for_requests()

    #more code

So I am wondering how this should be changed so that the requests made in the blurev.py file by functions and code called from main.py get logged to the logging_file. Right now they are not.

The problem is that loguru uses a completely different mechanism for logging than the classic logging library. The classic logging library builds a hierarchy of loggers, and log records are propagated up to the root (see the Advanced Logging Tutorial in the Python docs for reference). But loguru does not use this hierarchy at all; it operates in a way completely disjoint from it.
So if you want the logs emitted with the classic logging library to ultimately be handled by loguru, you have to intercept them.
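To make the disconnect concrete, here is a tiny standalone sketch (hypothetical demo code, not part of the question's files) showing that a record emitted through the standard logging module never reaches a loguru sink on its own:

# demo_disconnect.py

import logging
from loguru import logger

logger.add("demo.log", level="DEBUG")            # a loguru file sink
logging.getLogger("some.module").warning("hi")   # a standard-logging record

# "hi" is printed to stderr by logging's last-resort handler, but it never
# appears in demo.log: the record only propagates up the standard logging
# hierarchy, and loguru's sinks are not part of that hierarchy.

The InterceptHandler used below is what bridges that gap.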

Here is a Minimal Reproducible Example of my solution:

# main.py

import logging as classic_logging
import os

from blurev import BluerevApiRequestHandler
from loguru import logger as loguru_logger


@loguru_logger.catch
def main():
    logging_file = r"plan_review_distributor.log"
    loguru_logger.add(os.path.join(".", logging_file), filter=lambda rec: rec["name"] != "blurev", level="WARNING")
    loguru_logger.add(os.path.join(".", logging_file), filter="blurev", level="DEBUG")

    root_logger = classic_logging.getLogger("")
    root_logger.handlers = [InterceptHandler()]  # replace ANY pre-existing handler
    root_logger.setLevel("DEBUG")

    request_handler = BluerevApiRequestHandler()
    request_handler.do_something_using_requests()


########################################################################################################################
# recipe from https://loguru.readthedocs.io/en/stable/overview.html#entirely-compatible-with-standard-logging
class InterceptHandler(classic_logging.Handler):
    def emit(self, record):
        # Get corresponding Loguru level if it exists
        try:
            level = loguru_logger.level(record.levelname).name
        except ValueError:
            level = record.levelno

        # Find caller from where originated the logged message
        frame, depth = classic_logging.currentframe(), 2
        while frame.f_code.co_filename == classic_logging.__file__:
            frame = frame.f_back
            depth += 1

        loguru_logger.opt(depth=depth, exception=record.exc_info).log(level, record.getMessage())
########################################################################################################################


if __name__ == "__main__":
    main()

# bluerev.py

import logging as classic_logging

import requests

bluerev_logger = classic_logging.getLogger(__name__)


class BluerevApiRequestHandler:

    def do_something_using_requests(self):
        print(requests.get("http://example.com").text.splitlines()[3])  # expecting: "    <title>Example Domain</title>"
        classic_logging.getLogger("requests.packages.urllib3").warning("warning from requests")  # simulating
        bluerev_logger.debug("debug from bluerev")
        bluerev_logger.critical("critical from bluerev")

The idea is to add a handler to the classic root logger which transmits the log records to the loguru logger. We don't need setup_logger_for_requests; we just let the log records propagate naturally from "requests.packages.urllib3" up the hierarchy.
I added a filter to the first loguru sink so that a "blurev" log record does not get written twice to the file (because it would otherwise be caught by both sinks).
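As an alternative to the two complementary sinks, loguru's add() also accepts a dict as the filter argument, mapping module prefixes to minimum levels; if your loguru version supports it, a single sink like the sketch below (same file path and names as above) should behave equivalently:

# alternative: one sink with per-module minimum levels (dict filter)
loguru_logger.add(
    os.path.join(".", logging_file),
    filter={"": "WARNING", "blurev": "DEBUG"},  # WARNING by default, DEBUG for "blurev" and its children
)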

This is the output I get:

# stdout

2022-01-07 11:58:02.480 | DEBUG    | urllib3.connectionpool:_new_conn:227 - Starting new HTTP connection (1): example.com:80
2022-01-07 11:58:02.653 | DEBUG    | urllib3.connectionpool:_make_request:452 - http://example.com:80 "GET / HTTP/1.1" 200 648
2022-01-07 11:58:02.654 | WARNING  | blurev:do_something_using_requests:14 - warning from requests
2022-01-07 11:58:02.654 | DEBUG    | blurev:do_something_using_requests:15 - debug from bluerev
2022-01-07 11:58:02.654 | CRITICAL | blurev:do_something_using_requests:16 - critical from bluerev
    <title>Example Domain</title>
# plan_review_distributor.log

2022-01-07 11:58:02.654 | WARNING  | blurev:do_something_using_requests:14 - warning from requests
2022-01-07 11:58:02.654 | DEBUG    | blurev:do_something_using_requests:15 - debug from bluerev
2022-01-07 11:58:02.654 | CRITICAL | blurev:do_something_using_requests:16 - critical from bluerev

See Loguru's documentation section "Entirely compatible with standard logging".
Also, you sometimes wrote "bluerev" and other times "blurev", so better watch out!
