
Python Logging - dictConfig - logging destination for sub modules

I have a Python logging server, two test applications, and one shared module (submod.py). I want both applications to be able to send log events to the server and have the server decide how to store them in separate log files. This works fairly easily until the shared module starts to log: I do not know how to let the server identify which program the shared module is sending log events on behalf of, so that they end up in the correct log file.

My logging server is a slightly modified version of the code I found here.
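
Since the link is not reproduced here, below is a minimal sketch of the kind of socket-based receiver the Python documentation describes for network logging; the actual server in use is assumed to follow the same pattern (unpickle each record and hand it to a local logger), but its details may differ.

import logging
import logging.handlers
import pickle
import socketserver
import struct


class LogRecordStreamHandler(socketserver.StreamRequestHandler):
    """Handle one client connection: unpickle records and re-log them locally."""

    def handle(self):
        while True:
            # Each record arrives as a 4-byte big-endian length followed by
            # a pickled dict of the LogRecord's attributes.
            chunk = self.connection.recv(4)
            if len(chunk) < 4:
                break
            slen = struct.unpack('>L', chunk)[0]
            data = self.connection.recv(slen)
            while len(data) < slen:
                data += self.connection.recv(slen - len(data))
            record = logging.makeLogRecord(pickle.loads(data))
            # Hand the record to a local logger with the same name; the
            # server-side configuration then decides where it is written.
            logging.getLogger(record.name).handle(record)


if __name__ == '__main__':
    server = socketserver.ThreadingTCPServer(
        ('localhost', logging.handlers.DEFAULT_TCP_LOGGING_PORT),
        LogRecordStreamHandler)
    server.serve_forever()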

I tried to modify it to use a dictionary logging configuration similar to the following:

test_log.conf

"handlers": {
            "console": {
                "class": "logging.StreamHandler",
                "level": "DEBUG",
                "formatter": "complex",
                "stream": "ext://sys.stdout"
            },
            "test_01": {
                "class": "logging.handlers.RotatingFileHandler",
                "level": "INFO",
                "formatter": "complex",
                "filename": "test_01.log",
                "mode": "a",
                "backupCount": 5,
                "encoding": "utf8"
            },
            "test_02": {
                "class": "logging.handlers.RotatingFileHandler",
                "level": "INFO",
                "formatter": "complex",
                "filename": "test_02.log",
                "mode": "a",
                "backupCount": 5,
                "encoding": "utf8"
            },
            "file": {
                "class": "logging.handlers.RotatingFileHandler",
                "level": "INFO",
                "formatter": "complex",
                "filename": "root.log",
                "mode": "a",
                "backupCount": 5,
                "encoding": "utf8"
            }
        },
        "loggers": {
            "root": {
                "level": "INFO",
                "handlers": ["console", "file"]
            },
            "test_01":{
                "level": "INFO",
                "handlers": ["console", "test_01"]
            },
            "test_02": {
                "level": "INFO",
                "handlers": ["console", "test_02"]
            }
        }
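
For completeness, a minimal sketch of how the server side might load this configuration with logging.config.dictConfig, assuming the full test_log.conf is valid JSON and also contains the required "version" key and the "formatters" section (only a fragment is shown above):

import json
import logging.config

# Load the JSON configuration and apply it; after this call the handlers
# and loggers defined above exist in the server process.
with open('test_log.conf') as f:
    logging.config.dictConfig(json.load(f))

Note that in dictConfig a "root" entry under "loggers" configures a logger literally named "root", not the root logger; the root logger is configured through a top-level "root" key (or an empty-string logger name). Also, RotatingFileHandler only rolls over when maxBytes is set to a non-zero value, so backupCount on its own never triggers rotation.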

test_01.py

import logging
import logging.handlers

import submod

main_logger = logging.getLogger('')
main_logger.setLevel(logging.DEBUG)

socketHandler = logging.handlers.SocketHandler('localhost', logging.handlers.DEFAULT_TCP_LOGGING_PORT)

main_logger.addHandler(socketHandler)

logging.info('Test 01 main program')

a = submod.SubClass()

test_02.py

import logging
import logging.handlers

import submod

main_logger = logging.getLogger('')
main_logger.setLevel(logging.DEBUG)

socketHandler = logging.handlers.SocketHandler('localhost', logging.handlers.DEFAULT_TCP_LOGGING_PORT)

main_logger.addHandler(socketHandler)

logging.info('Test 02 main program')

a = submod.SubClass()

submod.py

import logging


class SubClass(object):
    def __init__(self):
        log = logging.getLogger()
        log.debug('Debug')
        log.info('Info')
        log.warning('Warning')
        log.error('Error')
        log.critical('Critical')
        print(__name__)

How can I have the logging server intelligently know where to log messages from submod.py when both test_01 and test_02 are calling it?
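
For what it is worth, one way the two programs could identify themselves to the server is sketched below, under the assumption that each client is free to attach a logging.Filter to its SocketHandler; the attribute name app is invented for this example and is not part of the code above.

import logging
import logging.handlers


class AppNameFilter(logging.Filter):
    """Stamp every outgoing record with the name of the program."""

    def __init__(self, app_name):
        super().__init__()
        self.app_name = app_name

    def filter(self, record):
        record.app = self.app_name   # travels inside the pickled record
        return True


socket_handler = logging.handlers.SocketHandler(
    'localhost', logging.handlers.DEFAULT_TCP_LOGGING_PORT)
socket_handler.addFilter(AppNameFilter('test_01'))   # 'test_02' in the other program

main_logger = logging.getLogger('')
main_logger.setLevel(logging.DEBUG)
main_logger.addHandler(socket_handler)

Every record the server unpickles then carries record.app set to 'test_01' or 'test_02', regardless of whether it was emitted by the main program or by submod.py, and the server can use that attribute to pick the destination handler.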

I apologize for the formatting and the confusing explanation; this problem has thoroughly fried my brain at this point.

Edited: for clarity, and to re-word a bad explanation.

Just use a config file in which you can predefine the destination of your log file based on the program that uses it. Python's logging module accomplishes all the tasks you need; here is an example of a config file: http://www.zetadev.com/software/aspen/trunk/doc/html/logging-conf.html
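
To make that suggestion concrete, here is a hedged sketch of what predefining the destination per program could look like with the dictionary configuration from the question: each program logs under its own top-level logger name, and the shared module logs through a child of that logger. The make_app_logger helper and the parent_logger argument are invented for this sketch.

import logging
import logging.handlers


def make_app_logger(app_name):
    """Create the per-program logger that all of the program's records flow through."""
    logger = logging.getLogger(app_name)          # e.g. 'test_01' or 'test_02'
    logger.setLevel(logging.DEBUG)
    logger.addHandler(logging.handlers.SocketHandler(
        'localhost', logging.handlers.DEFAULT_TCP_LOGGING_PORT))
    return logger


# submod.py would accept the caller's logger instead of using the root logger:
class SubClass(object):
    def __init__(self, parent_logger):
        self.log = parent_logger.getChild('SubClass')
        self.log.info('Info from SubClass')


# test_01.py would then do:
main_logger = make_app_logger('test_01')
main_logger.info('Test 01 main program')
a = SubClass(main_logger)   # record name: 'test_01.SubClass'

With a receiver like the one sketched earlier, a record named 'test_01.SubClass' propagates up to the server's 'test_01' logger and lands in test_01.log, while the same call made from test_02 lands in test_02.log.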
