
Why does Python logging RotatingFileHandler lose records when used in multiple processes?

Recently I've realized that my application generates fewer log records than I expected. After some experiments I found that the problem lies in RotatingFileHandler combined with multiprocessing.

import logging
from logging import handlers
from multiprocessing import Pool
import os


log_file_name = 'log.txt'
def make_logger():
    logger = logging.getLogger('my_logger')
    logger.setLevel(logging.INFO)

    # Avoid attaching a second handler if this process has already configured one
    current_handler_names = {handler.name for handler in logger.handlers}
    handler_name = 'my_handler'
    if handler_name in current_handler_names:
        return logger

    # Small maxBytes (10 KiB) so rotations happen frequently during the test
    handler = handlers.RotatingFileHandler(
        log_file_name, maxBytes=10 * 2 ** 10, backupCount=0)
    handler.setLevel(logging.INFO)
    handler.set_name(handler_name)

    logger.addHandler(handler)

    return logger



def f(x):
    # Each pool worker (re)uses its per-process logger and writes one record
    logger = make_logger()
    logger.info('hey %s' % x)


if os.path.exists(log_file_name):
    os.unlink(log_file_name)

p = Pool(processes=30)
N = 1000
p.map(f, range(N))

# Count how many of the N records actually made it into the log file
with open(log_file_name, 'r') as f:
    print 'expected: %s, real: %s' % (N, f.read().count('hey'))

Output:

$ python main.py
expected: 1000, real: 943

What did I do wrong?

As the Python logging documentation explains:

Although logging is thread-safe, and logging to a single file from multiple threads in a single process is supported, logging to a single file from multiple processes is not supported.

In a few words, one process's RotatingFileHandler closes the log file, rotates it away, and opens a fresh one. The other processes don't know about the new file descriptor; they keep writing to the file that has just been rotated out. Only the process that managed to rotate the file first continues logging to the current file.
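To make this concrete, here is a small stand-alone sketch (not from the original question; file names are placeholders) that mimics the rename-style rollover on a POSIX system, roughly what doRollover() does when backupCount > 0:

import os

path = 'demo.log'

# Two writers holding the same file open, like two worker processes sharing a log
a = open(path, 'a')
b = open(path, 'a')

a.write('a1\n')
a.flush()

# Writer A "rotates": it moves the file aside and opens a fresh one
a.close()
os.rename(path, path + '.1')
a = open(path, 'a')

# Writer B knows nothing about the rotation: its descriptor still points
# at the renamed file, so this record never reaches the new demo.log
b.write('b1\n')
b.flush()

a.write('a2\n')
a.flush()
a.close()
b.close()

with open(path) as f:
    print(f.read())  # prints only 'a2'; the 'b1' record ended up in demo.log.1

In the question's script the same kind of race happens across 30 processes, and with backupCount=0 there is nowhere for the rotated-away records to go, hence the missing log lines.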

In my answer to a similar question I proposed using the logrotate daemon to rotate the file outside of these processes. It does not close the file descriptor; it just truncates the file in place. The file thus stays the same and all processes can continue logging to it.
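For completeness, a minimal logrotate sketch of that idea (the path, size, and retention are placeholders, not values from the question); the copytruncate directive is what gives the "truncate in place, keep the descriptor" behaviour described above:

# /etc/logrotate.d/myapp -- hypothetical config for the application above
/var/log/myapp/log.txt {
    size 10k          # rotate once the file grows past ~10 KiB
    rotate 5          # keep five rotated copies
    copytruncate      # copy the log aside, then truncate it in place,
                      # so writers keep their existing file descriptors
    missingok
    notifempty
}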
