
Logging dictConfig does not work with multiprocessing

My problem is that my Python logging configuration does not work when it is loaded from a configuration file and used together with multiprocessing.

I'm using Python 2.7.

Example

Here is my example.py:

import json
import logging
import logging.config
import multiprocessing


def get_logger():
    with open("logging.conf") as fd:
        config_json_obj = json.load(fd)
    #logging.config.dictConfig(config_json_obj)
    logging.basicConfig(filename="my.log", level=logging.INFO)
    logger = logging.getLogger(__name__)
    return logger

def funSquare(num):
    logger = get_logger()
    # write logging info from process
    logger.info(num ** 2)
    return num ** 2


if __name__ == '__main__':
    logger = get_logger()

    logger.info("Start program")
    pool = multiprocessing.Pool()
    results = pool.map(funSquare, range(10))
    print(results)

And here is my logging.conf file:

{
  "version": 1,
  "formatters":
  {
    "simple":
    {
      "format": "%(asctime)s - %(name)s - %(levelname)s - %(message)s"
    },
    "verbose":
    {
      "format": "%(asctime)s %(thread)s %(levelname)s %(module)s [-] %(message)s"
    }
  },
  "handlers":
  {
    "console":
    {
      "class": "logging.StreamHandler",
      "level": "DEBUG",
      "formatter": "simple",
      "stream": "ext://sys.stdout"
    },
    "file":
    {
      "class": "logging.FileHandler",
      "formatter": "verbose",
      "level": "DEBUG",
      "filename": "mylog.log"
    }
  },
  "loggers":
  {
    "Main":
    {
      "level": "INFO",
      "handlers": ["console", "file"],
      "propagate": 0
    }
  },
  "root":
  {
    "level": "DEBUG",
    "handlers": ["file"]
  }
}

At this point, example.py works: the my.log file contains logging lines from both the main program and the worker processes.

Reproducing my problem

Now, if I replace these lines:

#logging.config.dictConfig(config_json_obj)
logging.basicConfig(filename="my.log", level=logging.INFO)

with:

logging.config.dictConfig(config_json_obj)
#logging.basicConfig(filename="my.log", level=logging.INFO)

then the my.log file only contains logging lines from the main program, not from the worker processes.

Now my questions:

  1. Why is this happening?
  2. What is the solution?

Thank you for your help.

You shouldn't expect writing from multiple processes into a single file to work (it may work in some situations, but it can't be guaranteed). The correct way is to use a multiprocessing queue and have all the file writing done by a single process, as outlined in this blog post from 2010 (too long to reproduce here). The important thing is that each process configures logging after it's created. You can adapt the linked example to use dictConfig rather than the programmatic configuration shown there, if you need to.
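
A minimal sketch of that approach (not the blog post's exact code) could look like the following. It assumes Python 3.2+ for logging.handlers.QueueHandler (on Python 2.7 the logutils package provides an equivalent class), and it uses a Manager queue so the queue can be handed to pool workers; only the listener process ever opens the log file:

import logging
import logging.config
import logging.handlers
import multiprocessing


def listener_process(queue):
    # The single process that owns the log file; only it gets a FileHandler.
    logging.config.dictConfig({
        "version": 1,
        "formatters": {
            "verbose": {"format": "%(asctime)s %(process)d %(levelname)s %(module)s [-] %(message)s"}
        },
        "handlers": {
            "file": {"class": "logging.FileHandler", "formatter": "verbose",
                     "level": "DEBUG", "filename": "mylog.log"}
        },
        "root": {"level": "DEBUG", "handlers": ["file"]}
    })
    while True:
        record = queue.get()
        if record is None:        # sentinel: shut the listener down
            break
        logging.getLogger(record.name).handle(record)


def worker_init(queue):
    # Runs once per process: route every record to the queue instead of a file.
    root = logging.getLogger()
    root.addHandler(logging.handlers.QueueHandler(queue))
    root.setLevel(logging.INFO)


def funSquare(num):
    logging.getLogger(__name__).info(num ** 2)
    return num ** 2


if __name__ == '__main__':
    queue = multiprocessing.Manager().Queue()   # proxy queue is picklable, so Pool workers can use it
    listener = multiprocessing.Process(target=listener_process, args=(queue,))
    listener.start()

    pool = multiprocessing.Pool(initializer=worker_init, initargs=(queue,))

    worker_init(queue)                          # configure the main process after the pool is created,
                                                # so forked workers don't inherit a second QueueHandler
    logging.getLogger(__name__).info("Start program")

    print(pool.map(funSquare, range(10)))
    pool.close()
    pool.join()

    queue.put(None)                             # tell the listener to stop
    listener.join()

The key point is that funSquare no longer touches any file: the workers only put records on the queue, and the single listener process serialises all writes to mylog.log.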

The problem is that __name__ evaluates to '__main__'; you need to

  • either: rename "Main" in your config file to "__main__" (so it matches the name of the logger)
  • or (maybe more robust): use logger = logging.getLogger('Main') in your main file

Otherwise, the logger you are getting with logging.getLogger(__name__) is simply not configured.
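
For example, a minimal sketch of the second option, which keeps the config file unchanged so that the "Main" entry with its handlers is the one being used:

import json
import logging
import logging.config

if __name__ == '__main__':
    with open("logging.conf") as fd:
        logging.config.dictConfig(json.load(fd))

    logger = logging.getLogger('Main')   # matches the "Main" logger in logging.conf
    logger.info("Start program")         # now goes to the console and file handlers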

In logging.config.dictConfig, the root logger is identified by the empty name "". Thus, in your dictionary, you should use:

"":
{
  "level": "DEBUG",
  "handlers": ["file"]
}

to properly set the root logger (whereas in the .ini file format you use "root"; don't ask me why).
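
If you follow that suggestion, the "" entry goes inside the "loggers" mapping. A minimal, self-contained sketch in Python (illustrative handler names, not the original config):

import logging
import logging.config

logging.config.dictConfig({
    "version": 1,
    "handlers": {
        "console": {"class": "logging.StreamHandler", "level": "DEBUG"}
    },
    "loggers": {
        # the empty name addresses the root logger
        "": {"level": "DEBUG", "handlers": ["console"]}
    }
})

logging.getLogger("anything").debug("handled by the root logger's console handler")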
