
Python 3.6 multiprocessing logging

I looked at the logging code and was somewhat puzzled.
Does Python's logging module support multiprocessing?

Environment

python 3.6.6

logging 0.5.1.2

Test code

import time
import logging
from multiprocessing import Process, current_process

print(logging.__version__)
# setup log
logger = logging.getLogger(__name__)
logging.basicConfig(level=logging.DEBUG,
                    format='%(asctime)s %(filename)s[line:%(lineno)d] %(levelname)s %(message)s',
                    datefmt='%a, %d %b %Y %H:%M:%S',
                    filename='test.log',
                    filemode='w')


def func(the_time, logger):
    # Busy-wait until the shared start time, then emit one large record
    # (the process name repeated 5000 times) so interleaved writes are easy to spot.
    proc = current_process()
    while True:
        if time.time() >= the_time:
            logger.info('proc name %s id %s %s' % (proc.name, proc.pid, str(proc.name) * 5000))
            return



if __name__ == '__main__':

    the_time = time.time() + 5

    # Start nine worker processes that all log at (roughly) the same moment.
    for x in range(1, 10):
        proc = Process(target=func, name=x, args=(the_time, logger))
        proc.start()

Problem

With Python 3.6.6, multiple processes write to the same log file and the records do not get mixed up.

With Python 2.7, the records do get garbled.

If you know what causes this, please let me know. Thank you.

Regardless of the Python version, the different processes write to the file without any synchronization, so clobbered records are to be expected.

Python does not resolve this for you automatically, but it gives you the tools to deal with it.

In Python 3, use QueueHandler and QueueListener, as shown in the sketch below.
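
A minimal sketch of that approach: each child process attaches a QueueHandler to its logger, and a single QueueListener in the parent process does all of the file writing, so records can no longer clobber each other. The file name queue_test.log and the record format here are just placeholders.

import logging
import logging.handlers
from multiprocessing import Process, Queue, current_process


def worker(queue):
    # Each worker sends its records to the shared queue instead of
    # writing to the file directly.
    root = logging.getLogger()
    root.setLevel(logging.DEBUG)
    root.addHandler(logging.handlers.QueueHandler(queue))
    root.info('message from %s', current_process().name)


if __name__ == '__main__':
    queue = Queue()
    # The listener in the main process is the only writer to the file.
    handler = logging.FileHandler('queue_test.log', mode='w')
    handler.setFormatter(logging.Formatter(
        '%(asctime)s %(processName)s %(levelname)s %(message)s'))
    listener = logging.handlers.QueueListener(queue, handler)
    listener.start()

    procs = [Process(target=worker, args=(queue,)) for _ in range(5)]
    for p in procs:
        p.start()
    for p in procs:
        p.join()

    listener.stop()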

In Python 2 or 3, use multiprocessing-logging, which works only on Linux; see the sketch below.
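
A rough sketch of using that library, assuming it exposes install_mp_handler() as its entry point (check its README for the exact API); the file name mp_test.log is a placeholder. The call wraps the handlers that basicConfig installed, so it has to happen before the child processes are started.

# pip install multiprocessing-logging
import logging
from multiprocessing import Process, current_process

import multiprocessing_logging

logging.basicConfig(level=logging.DEBUG,
                    format='%(asctime)s %(processName)s %(levelname)s %(message)s',
                    filename='mp_test.log',
                    filemode='w')
# Wrap the existing handlers so child processes forward their records
# to the parent process, which does the actual writing.
multiprocessing_logging.install_mp_handler()


def func():
    logging.getLogger(__name__).info('hello from %s', current_process().name)


if __name__ == '__main__':
    procs = [Process(target=func) for _ in range(5)]
    for p in procs:
        p.start()
    for p in procs:
        p.join()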
