Python Standard Module: logging


Original: https://www.cnblogs.com/zhbzz2007/p/5943685.html

1 Introduction to the logging Module

The logging module is a standard module built into Python, used mainly to output runtime logs. It lets you set the level of the output, the log file path, log file rotation, and more. Compared with print, it has the following advantages:

  1. By setting different log levels, a release version can output only the important information, without displaying a large amount of debugging output.
  2. print sends everything to standard output, which seriously interferes with viewing other data there; with logging, the developer decides where and how information is output.

2 Using the logging Module

2.1 Basic Use

Configure logging's basic settings with basicConfig, then output logs to the console.

import logging
logging.basicConfig(level = logging.INFO,format = '%(asctime)s - %(name)s - %(levelname)s - %(message)s')
logger = logging.getLogger(__name__)

logger.info("Start print log")
logger.debug("Do something")
logger.warning("Something maybe fail.")
logger.info("Finish")

When the script runs, the console shows the INFO and WARNING messages but not the DEBUG message, because its level is below INFO. Raising the level to DEBUG:

logging.basicConfig(level = logging.DEBUG,format = '%(asctime)s - %(name)s - %(levelname)s - %(message)s')

Now the console output includes the debug message:

2016-10-09 19:12:08,289 - __main__ - INFO - Start print log
2016-10-09 19:12:08,289 - __main__ - DEBUG - Do something
2016-10-09 19:12:08,289 - __main__ - WARNING - Something maybe fail.
2016-10-09 19:12:08,289 - __main__ - INFO - Finish

logging.basicConfig function parameters:

filename: specify the log file name;

filemode: same as for open(), specify the mode in which the log file is opened, 'w' or 'a';

format: specify the format and content of the output; the format string can include many useful fields:

Field: Meaning

%(levelno)s: the numeric value of the log level
%(levelname)s: the name of the log level
%(pathname)s: the full path of the source file where the logging call was issued
%(filename)s: the file name portion of that path
%(funcName)s: the name of the function issuing the logging call
%(lineno)d: the source line number of the logging call
%(asctime)s: the time of the logging call
%(thread)d: thread ID
%(threadName)s: thread name
%(process)d: process ID
%(message)s: the logged message

datefmt: specify the time format, the same as time.strftime();

level: set the logging level; the default is logging.WARNING;

stream: specify the output stream of the log; it can be sys.stderr, sys.stdout, or a file, and defaults to sys.stderr. When stream and filename are specified at the same time, stream is ignored;

2.2 Write logs to files

2.2.1 Write logs to files

Set up logging, create a FileHandler, set the output format, and add the handler to the logger; the log is then written to the specified file.

import logging
logger = logging.getLogger(__name__)
logger.setLevel(level = logging.INFO)
handler = logging.FileHandler("log.txt")
handler.setLevel(logging.INFO)
formatter = logging.Formatter('%(asctime)s - %(name)s - %(levelname)s - %(message)s')
handler.setFormatter(formatter)
logger.addHandler(handler)

logger.info("Start print log")
logger.debug("Do something")
logger.warning("Something maybe fail.")
logger.info("Finish")

The resulting log data in log.txt (the DEBUG record is filtered out because the level is INFO):

2016-10-09 19:01:13,263 - __main__ - INFO - Start print log
2016-10-09 19:01:13,263 - __main__ - WARNING - Something maybe fail.
2016-10-09 19:01:13,263 - __main__ - INFO - Finish

2.2.2 Export logs to both screen and log files

Adding StreamHandler to logger can output the log to the screen.

import logging
logger = logging.getLogger(__name__)
logger.setLevel(level = logging.INFO)
handler = logging.FileHandler("log.txt")
handler.setLevel(logging.INFO)
formatter = logging.Formatter('%(asctime)s - %(name)s - %(levelname)s - %(message)s')
handler.setFormatter(formatter)

console = logging.StreamHandler()
console.setLevel(logging.INFO)

logger.addHandler(handler)
logger.addHandler(console)

logger.info("Start print log")
logger.debug("Do something")
logger.warning("Something maybe fail.")
logger.info("Finish")

The records now appear both in the log.txt file and on the console:

2016-10-09 19:20:46,553 - __main__ - INFO - Start print log
2016-10-09 19:20:46,553 - __main__ - WARNING - Something maybe fail.
2016-10-09 19:20:46,553 - __main__ - INFO - Finish

It can be seen that logging routes records through a central Logger object, and additional destinations are attached with addHandler(). The logging package ships with several handlers:

Handler: import path; purpose

StreamHandler: logging.StreamHandler; outputs logs to a stream, which can be sys.stderr, sys.stdout, or a file
FileHandler: logging.FileHandler; outputs logs to a file
BaseRotatingHandler: logging.handlers.BaseRotatingHandler; base class for rotating log handlers
RotatingFileHandler: logging.handlers.RotatingFileHandler; rolls the log file over once it reaches a maximum size, keeping a limited number of backup files
TimedRotatingFileHandler: logging.handlers.TimedRotatingFileHandler; rolls the log file over at timed intervals
SocketHandler: logging.handlers.SocketHandler; sends logs to a remote TCP/IP socket
DatagramHandler: logging.handlers.DatagramHandler; sends logs to a remote UDP socket
SMTPHandler: logging.handlers.SMTPHandler; sends logs to an email address
SysLogHandler: logging.handlers.SysLogHandler; outputs logs to syslog
NTEventLogHandler: logging.handlers.NTEventLogHandler; sends logs to the Windows NT/2000/XP event log
MemoryHandler: logging.handlers.MemoryHandler; buffers logs in memory, flushing to a target handler
HTTPHandler: logging.handlers.HTTPHandler; sends logs to an HTTP server via "GET" or "POST"
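Any of these can be attached the same way as the FileHandler above. For instance, this minimal sketch (the file name timed.log is illustrative) uses a TimedRotatingFileHandler that rolls the file over at midnight and keeps seven backups:

```python
import logging
from logging.handlers import TimedRotatingFileHandler

logger = logging.getLogger("timed_demo")
logger.setLevel(logging.INFO)

# Roll the log over at midnight and keep at most 7 old files.
handler = TimedRotatingFileHandler("timed.log", when="midnight", backupCount=7)
handler.setFormatter(logging.Formatter("%(asctime)s - %(levelname)s - %(message)s"))
logger.addHandler(handler)

logger.info("logged via TimedRotatingFileHandler")
```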

2.2.3 Log Rotation

Log rotation can be achieved by using RotatingFileHandler.

import logging
from logging.handlers import RotatingFileHandler
logger = logging.getLogger(__name__)
logger.setLevel(level = logging.INFO)
#Define a RotatingFileHandler: keep at most 3 backup files, each at most 1 KB
rHandler = RotatingFileHandler("log.txt",maxBytes = 1*1024,backupCount = 3)
rHandler.setLevel(logging.INFO)
formatter = logging.Formatter('%(asctime)s - %(name)s - %(levelname)s - %(message)s')
rHandler.setFormatter(formatter)

console = logging.StreamHandler()
console.setLevel(logging.INFO)
console.setFormatter(formatter)

logger.addHandler(rHandler)
logger.addHandler(console)

logger.info("Start print log")
logger.debug("Do something")
logger.warning("Something maybe fail.")
logger.info("Finish")

You can see the backup log files in the project directory.

2016/10/09  19:36               732 log.txt
2016/10/09  19:36               967 log.txt.1
2016/10/09  19:36               985 log.txt.2
2016/10/09  19:36               976 log.txt.3
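The rollover behaviour can also be observed directly by shrinking maxBytes so that a handful of records triggers it; this sketch (the file name rotate.log is illustrative) then lists the files that were produced:

```python
import logging
import os
from logging.handlers import RotatingFileHandler

logger = logging.getLogger("rotate_demo")
logger.setLevel(logging.INFO)

# A deliberately tiny maxBytes so a few ~50-byte records force rollover.
handler = RotatingFileHandler("rotate.log", maxBytes=200, backupCount=2)
handler.setFormatter(logging.Formatter("%(message)s"))
logger.addHandler(handler)

for i in range(20):
    logger.info("record %03d - padding so each line exceeds a few dozen bytes", i)

# rotate.log is the current file; rotate.log.1 and rotate.log.2 are backups.
print(sorted(f for f in os.listdir(".") if f.startswith("rotate.log")))
```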

2.3 Setting the level of messages

Different log levels can be set to control the output of the log.

Level: When to use

FATAL: fatal error (an alias of CRITICAL)
CRITICAL: something particularly bad happened, such as memory exhaustion or a full disk; rarely used
ERROR: an error occurred, such as a failed IO operation or a connection problem
WARNING: an important event that is not an error, such as a wrong user password
INFO: routine processing, such as handling a request or a state change
DEBUG: debugging detail, such as the intermediate state of each loop in an algorithm
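The effect of the level threshold can be checked with setLevel() and isEnabledFor(); in this sketch the logger is set to WARNING, so DEBUG and INFO records are dropped:

```python
import logging

logger = logging.getLogger("level_demo")
logger.setLevel(logging.WARNING)   # suppress DEBUG and INFO
logger.addHandler(logging.StreamHandler())

logger.debug("not emitted")
logger.info("not emitted either")
logger.warning("emitted")

# isEnabledFor() lets you test a level before building an expensive message.
print(logger.isEnabledFor(logging.INFO))    # False
print(logger.isEnabledFor(logging.ERROR))   # True
```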

2.4 Capture traceback

Python's traceback module tracks exception information, and a traceback can also be recorded in the log through logging.

Code,

import logging
logger = logging.getLogger(__name__)
logger.setLevel(level = logging.INFO)
handler = logging.FileHandler("log.txt")
handler.setLevel(logging.INFO)
formatter = logging.Formatter('%(asctime)s - %(name)s - %(levelname)s - %(message)s')
handler.setFormatter(formatter)

console = logging.StreamHandler()
console.setLevel(logging.INFO)

logger.addHandler(handler)
logger.addHandler(console)

logger.info("Start print log")
logger.debug("Do something")
logger.warning("Something maybe fail.")
try:
    open("sklearn.txt","rb")
except (SystemExit,KeyboardInterrupt):
    raise
except Exception:
logger.error("Failed to open sklearn.txt from logger.error",exc_info = True)

logger.info("Finish")

Output in the console and the log.txt file:

Start print log
Something maybe fail.
Failed to open sklearn.txt from logger.error
Traceback (most recent call last):
  File "G:\zhb7627\Code\Eclipse WorkSpace\PythonTest\test.py", line 23, in <module>
    open("sklearn.txt","rb")
IOError: [Errno 2] No such file or directory: 'sklearn.txt'
Finish

You can also use logger.exception(msg, *args), which is equivalent to logger.error(msg, exc_info=True, *args).

Replacing

logger.error("Failed to open sklearn.txt from logger.error",exc_info = True)

with

logger.exception("Failed to open sklearn.txt from logger.exception")

Output in the console and the log.txt file:

Start print log
Something maybe fail.
Failed to open sklearn.txt from logger.exception
Traceback (most recent call last):
  File "G:\zhb7627\Code\Eclipse WorkSpace\PythonTest\test.py", line 23, in <module>
    open("sklearn.txt","rb")
IOError: [Errno 2] No such file or directory: 'sklearn.txt'
Finish

2.5 Multi-module logging

The main module mainModule.py,

import logging
import subModule
logger = logging.getLogger("mainModule")
logger.setLevel(level = logging.INFO)
handler = logging.FileHandler("log.txt")
handler.setLevel(logging.INFO)
formatter = logging.Formatter('%(asctime)s - %(name)s - %(levelname)s - %(message)s')
handler.setFormatter(formatter)

console = logging.StreamHandler()
console.setLevel(logging.INFO)
console.setFormatter(formatter)

logger.addHandler(handler)
logger.addHandler(console)


logger.info("creating an instance of subModule.subModuleClass")
a = subModule.SubModuleClass()
logger.info("calling subModule.subModuleClass.doSomething")
a.doSomething()
logger.info("done with  subModule.subModuleClass.doSomething")
logger.info("calling subModule.some_function")
subModule.some_function()
logger.info("done with subModule.some_function")

Submodule subModule.py,

import logging

module_logger = logging.getLogger("mainModule.sub")
class SubModuleClass(object):
    def __init__(self):
        self.logger = logging.getLogger("mainModule.sub.module")
        self.logger.info("creating an instance in SubModuleClass")
    def doSomething(self):
        self.logger.info("do something in SubModule")
        a = []
        a.append(1)
        self.logger.debug("list a = " + str(a))
        self.logger.info("finish something in SubModuleClass")

def some_function():
    module_logger.info("call function some_function")

After execution, the console and the log file log.txt both contain:

2016-10-09 20:25:42,276 - mainModule - INFO - creating an instance of subModule.subModuleClass
2016-10-09 20:25:42,279 - mainModule.sub.module - INFO - creating an instance in SubModuleClass
2016-10-09 20:25:42,279 - mainModule - INFO - calling subModule.subModuleClass.doSomething
2016-10-09 20:25:42,279 - mainModule.sub.module - INFO - do something in SubModule
2016-10-09 20:25:42,279 - mainModule.sub.module - INFO - finish something in SubModuleClass
2016-10-09 20:25:42,279 - mainModule - INFO - done with  subModule.subModuleClass.doSomething
2016-10-09 20:25:42,279 - mainModule - INFO - calling subModule.some_function
2016-10-09 20:25:42,279 - mainModule.sub - INFO - call function some_function
2016-10-09 20:25:42,279 - mainModule - INFO - done with subModule.some_function

First, the logger 'mainModule' is defined and configured in the main module. Anywhere else in the same interpreter process, getLogger('mainModule') returns the same object, which can be used directly without reconfiguration. Child loggers of a configured logger share its definition and configuration. The parent-child relationship is established by naming: any logger whose name starts with 'mainModule.' is its child, such as 'mainModule.sub'.
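This parent-child relationship can be seen in a few lines: only the parent gets a handler, and the child's records propagate up to it (the logger names are the ones from the example above):

```python
import logging

# Configure only the parent logger.
parent = logging.getLogger("mainModule")
parent.setLevel(logging.INFO)
parent.addHandler(logging.StreamHandler())

# The child needs no handler or level of its own: its records propagate
# up to "mainModule", and its effective level is inherited from it.
child = logging.getLogger("mainModule.sub")
child.info("handled by the parent's handler")

print(child.parent.name)                           # mainModule
print(child.getEffectiveLevel() == logging.INFO)   # True
```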

When developing an application, we can write the application's logging configuration in a configuration file, generating a root logger such as 'PythonAPP'; load that configuration with fileConfig in the main function; and then use child loggers of that root logger, such as 'PythonAPP.Core' and 'PythonAPP.Web', in the other parts and modules of the application, without having to define and configure a logger for each module over and over again.
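A minimal sketch of that idea, assuming a hypothetical root logger named 'PythonAPP' and writing the fileConfig-style (ini) configuration inline so the example is self-contained:

```python
import logging.config
import textwrap

# A hypothetical logging.conf in the classic fileConfig (ini) format.
conf_text = textwrap.dedent("""\
    [loggers]
    keys=root

    [handlers]
    keys=consoleHandler

    [formatters]
    keys=simple

    [logger_root]
    level=INFO
    handlers=consoleHandler

    [handler_consoleHandler]
    class=StreamHandler
    level=INFO
    formatter=simple
    args=(sys.stdout,)

    [formatter_simple]
    format=%(asctime)s - %(name)s - %(levelname)s - %(message)s
    """)

with open("logging.conf", "w") as f:
    f.write(conf_text)

logging.config.fileConfig("logging.conf")

# Modules elsewhere simply ask for child loggers of the configured root.
logging.getLogger("PythonAPP.Core").info("configured from logging.conf")
```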

3 Configuring logging with a JSON or YAML file

Although logging can be configured in Python code, this is not flexible enough; the best way to configure logging is with a configuration file. Since Python 2.7, the logging configuration can be loaded from a dictionary, which means the configuration can come from a JSON or YAML file.
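Before turning to files, note that dictConfig accepts a plain dictionary directly, so a JSON or YAML file is just a serialization of a dict like this one (the names "simple" and "console" are arbitrary labels chosen for this sketch):

```python
import logging.config

config = {
    "version": 1,
    "disable_existing_loggers": False,
    "formatters": {
        "simple": {"format": "%(asctime)s - %(name)s - %(levelname)s - %(message)s"}
    },
    "handlers": {
        "console": {
            "class": "logging.StreamHandler",
            "level": "DEBUG",
            "formatter": "simple",
        }
    },
    "root": {"level": "INFO", "handlers": ["console"]},
}

logging.config.dictConfig(config)
logging.getLogger("demo").info("configured via dictConfig")
```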

3.1 Configuration via JSON file

JSON configuration file,

{
    "version":1,
    "disable_existing_loggers":false,
    "formatters":{
        "simple":{
            "format":"%(asctime)s - %(name)s - %(levelname)s - %(message)s"
        }
    },
    "handlers":{
        "console":{
            "class":"logging.StreamHandler",
            "level":"DEBUG",
            "formatter":"simple",
            "stream":"ext://sys.stdout"
        },
        "info_file_handler":{
            "class":"logging.handlers.RotatingFileHandler",
            "level":"INFO",
            "formatter":"simple",
            "filename":"info.log",
            "maxBytes":10485760,
            "backupCount":20,
            "encoding":"utf8"
        },
        "error_file_handler":{
            "class":"logging.handlers.RotatingFileHandler",
            "level":"ERROR",
            "formatter":"simple",
            "filename":"errors.log",
            "maxBytes":10485760,
            "backupCount":20,
            "encoding":"utf8"
        }
    },
    "loggers":{
        "my_module":{
            "level":"ERROR",
            "handlers":["info_file_handler"],
            "propagate":false
        }
    },
    "root":{
        "level":"INFO",
        "handlers":["console","info_file_handler","error_file_handler"]
    }
}

Load the configuration from the JSON file, then configure logging with logging.config.dictConfig.

import json
import logging.config
import os

def setup_logging(default_path = "logging.json",default_level = logging.INFO,env_key = "LOG_CFG"):
    path = default_path
    value = os.getenv(env_key,None)
    if value:
        path = value
    if os.path.exists(path):
        with open(path,"r") as f:
            config = json.load(f)
            logging.config.dictConfig(config)
    else:
        logging.basicConfig(level = default_level)

def func():
    logging.info("start func")

    logging.info("exec func")

    logging.info("end func")

if __name__ == "__main__":
    setup_logging(default_path = "logging.json")
    func()

3.2 Configuration via YAML file

Configuration through YAML files looks more concise than JSON.

version: 1
disable_existing_loggers: False
formatters:
        simple:
            format: "%(asctime)s - %(name)s - %(levelname)s - %(message)s"
handlers:
    console:
            class: logging.StreamHandler
            level: DEBUG
            formatter: simple
            stream: ext://sys.stdout
    info_file_handler:
            class: logging.handlers.RotatingFileHandler
            level: INFO
            formatter: simple
            filename: info.log
            maxBytes: 10485760
            backupCount: 20
            encoding: utf8
    error_file_handler:
            class: logging.handlers.RotatingFileHandler
            level: ERROR
            formatter: simple
            filename: errors.log
            maxBytes: 10485760
            backupCount: 20
            encoding: utf8
loggers:
    my_module:
            level: ERROR
            handlers: [info_file_handler]
            propagate: no
root:
    level: INFO
    handlers: [console,info_file_handler,error_file_handler]

Load the configuration from the YAML file, then configure logging with logging.config.dictConfig.

import yaml
import logging.config
import os

def setup_logging(default_path = "logging.yaml",default_level = logging.INFO,env_key = "LOG_CFG"):
    path = default_path
    value = os.getenv(env_key,None)
    if value:
        path = value
    if os.path.exists(path):
        with open(path,"r") as f:
            config = yaml.safe_load(f)
            logging.config.dictConfig(config)
    else:
        logging.basicConfig(level = default_level)

def func():
    logging.info("start func")

    logging.info("exec func")

    logging.info("end func")

if __name__ == "__main__":
    setup_logging(default_path = "logging.yaml")
    func()

Posted by cbrian on Tue, 03 Sep 2019 03:03:38 -0700