I am using the logging module in Python. In my main.py file I am using two loggers:
Root logger (to collect logs from multiple modules in the same directory)
Local logger (to log specific information)
I want the local logger's output to be separate from the root logger's. But when I create the separate logger, its messages also show up in the root logger's output.
Here is a sample of how I am doing this:
# main.py
import logging

def setup_logger(filename, name=''):
    if name == '':
        logging.basicConfig(filename=filename,
                            format='%(asctime)s %(funcName)s %(levelname)s %(message)s',
                            filemode='a')
        logger = logging.getLogger()
    else:
        """
        handler = logging.FileHandler(filename, mode='a')
        handler.setFormatter(logging.Formatter('%(asctime)s %(funcName)s %(levelname)s %(message)s'))
        logger = logging.getLogger(name)
        logger.addHandler(handler)
        """
        formatter = logging.Formatter('%(asctime)s %(levelname)s %(message)s')
        handler = logging.FileHandler(filename)
        handler.setFormatter(formatter)
        logger = logging.getLogger(name)
        logger.setLevel(logging.DEBUG)
        logger.addHandler(handler)
        return logger
    logger.setLevel(logging.DEBUG)
    return logger

logger = setup_logger('main.log')
local_logger = setup_logger('local_log.log', 'local_log')
# other file, logging under the root logger
logger = logging.getLogger("__main__." + __name__)
You have to stop propagation if you don't want the local logger to send its records to the root logger's handlers:
logger.propagate = False
This part of the documentation explains it well: https://docs.python.org/3/howto/logging.html#logging-flow
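A minimal runnable sketch of the fix, with StreamHandlers writing to in-memory buffers standing in for the file handlers so the effect is easy to see (the logger name 'local_log' matches the question; everything else is illustrative):

```python
import io
import logging

root_stream = io.StringIO()
local_stream = io.StringIO()

root_logger = logging.getLogger()
root_logger.setLevel(logging.DEBUG)
root_logger.addHandler(logging.StreamHandler(root_stream))

local_logger = logging.getLogger('local_log')
local_logger.setLevel(logging.DEBUG)
local_logger.addHandler(logging.StreamHandler(local_stream))
local_logger.propagate = False  # without this line, local records would also reach root_stream

local_logger.debug('local only')
root_logger.debug('root only')

print('local only' in root_stream.getvalue())   # False: propagation stopped
print('local only' in local_stream.getvalue())  # True
```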
I have written a logging utility with a default logger (let's say L1), which is initialized from a logconfig.yaml file; when L1 is used, the logs are written to "l1_logfile.log". In addition, I have defined another logger (let's say L2) with its own appender handler, such that when L2 is used, the log messages are written to the file "l2_logfile.log".
The L1 logger is initialized from the following static configuration (the logconfig.yaml file) by calling logging.config.dictConfig(config).
version: 1
formatters:
  json_formatter:
    format: '%(asctime)s %(levelname)s [%(threadName)s] %(name)s %(filename)s:%(funcName)s %(message)s'
    class: pythonjsonlogger.jsonlogger.JsonFormatter
handlers:
  file_handler:
    class: logging.handlers.RotatingFileHandler
    formatter: json_formatter
    filename: ../logs/logfile.log
    maxBytes: 10485760  # 10MB
    backupCount: 20
loggers:
  my_module:
    level: DEBUG
    handlers: [file_handler]
    propagate: no
root:
  level: DEBUG
  handlers: [file_handler]
The L1 logger and its configurator are shown below.
def setup_logging(path):
    with open(path, 'rt') as f:
        try:
            multiprocessing_logging.install_mp_handler()
            config = yaml.safe_load(f.read())
            logging.config.dictConfig(config)
        except Exception as e:
            print('Error in Logging Configuration. Using default configs')
            logging.basicConfig(filemode='w', level=DEFAULT_LOG_LEVEL)
The L2 logger is configured dynamically and also has a custom file appender.
def get_custom_logger_appender(appender_name, logger_name):
    logr_name = appender_name + logger_name
    logger = logging.getLogger(logr_name)
    format_str = '%(asctime)s %(levelname)s [%(threadName)s] %(name)s %(filename)s:%(funcName)s %(message)s'
    formatter = jsonlogger.JsonFormatter(format_str)
    log_file_name = os.path.join("./logs", '{}_{}.log'.format(appender_name, "appln"))
    file_handler = logging.handlers.RotatingFileHandler(log_file_name,
                                                        maxBytes=100000,
                                                        backupCount=15)
    file_handler.setFormatter(formatter)
    file_handler.setLevel(logging.DEBUG)
    logger.addHandler(file_handler)
    logger.setLevel(logging.DEBUG)
    adapter = CustomAdapter(logger, {'app_context': "app_context_value"})
    return adapter
Here is a unit test:
class TestSum(unittest.TestCase):
    # Default configurator.
    setup_logging("./logconfig.yaml")

    def test_appender_log_file_created(self):
        appender_name = "appender"
        logger_name = "testlogger"
        self.logger = Logger.get_custom_logger_appender(appender_name, logger_name)
        self.logger.info("This is info log 1")
        self.logger.info("This is info log 2")
        self.logger.info("This is info log 3")
        base_dir = self.logger_config.log_base_dir()
        log_file = base_dir + "/" + appender_name + '_' + LoggerConfig.appl_name() + ".log"
        assert os.path.exists(log_file)
The issue is that the three log messages are written to both log files, "l1_logfile.log" and "l2_logfile.log", whereas they should only be written to "l2_logfile.log", since they are emitted through the custom logger appender. What is wrong with my logic?
I solved it by implementing get_logger() using another RotatingFileHandler. The issue was that the logger config (from the .yaml file) was also being applied to the L2 logger instance by default. The code below configures the logger explicitly.
def get_logger(logger_name):
    log_file_name = os.path.join("logfile.log")
    file_handler = logging.handlers.RotatingFileHandler(log_file_name,
                                                        mode='a',
                                                        maxBytes=10*1024*1024,
                                                        backupCount=15)
    formatter = jsonlogger.JsonFormatter('%(asctime)s %(levelname)s [%(threadName)s] %(name)s %(filename)s:%(funcName)s %(message)s')
    file_handler.setFormatter(formatter)
    logger = logging.getLogger(logger_name)
    logger.setLevel(logging.DEBUG)
    logger.addHandler(file_handler)
    adapter = CustomAdapter(logger, {'app_context': "app_context"})
    Logger.__logger_dict[logger_name] = adapter
    Logger.update_log_level_config(logger_name, logger.level)
    return adapter
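The duplication can also be reproduced and fixed with propagation alone. The sketch below is illustrative, not the original code: in-memory streams stand in for the rotating file handlers, and the logger names are made up. A logger whose propagate flag is left at its default of True hands every record to the root handlers as well; setting it to False is a one-line alternative fix:

```python
import io
import logging

root_stream = io.StringIO()
root = logging.getLogger()
root.setLevel(logging.DEBUG)
root.addHandler(logging.StreamHandler(root_stream))  # stands in for the yaml file_handler

def make_logger(name, stream, propagate):
    lg = logging.getLogger(name)
    lg.setLevel(logging.DEBUG)
    lg.addHandler(logging.StreamHandler(stream))
    lg.propagate = propagate
    return lg

leaky_stream, tight_stream = io.StringIO(), io.StringIO()
leaky = make_logger('leaky', leaky_stream, True)    # mirrors the original L2 setup
tight = make_logger('tight', tight_stream, False)   # the one-line alternative fix

leaky.info('leaky record')
tight.info('tight record')

print('leaky record' in root_stream.getvalue())  # True: duplicated into the root handler
print('tight record' in root_stream.getvalue())  # False: no duplicate
```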
I am using the logging module in a legacy Python 2.7 project. I want to create the logs at a specific location, but the logging module creates the log files in the default place, i.e. the directory from which the script is executed.
Is there is any way to change this default location?
Below is the initialization of the logger.
logger = logging.getLogger(__name__)
logger.setLevel(logging.INFO)
formatter = logging.Formatter('%(message)s')
file_handler = logging.FileHandler('file.log')
file_handler.setFormatter(formatter)
logger.addHandler(file_handler)
Below is an example of how you can set the location and other properties of a Python logger. You can define a get_logger function as follows:
import logging
import os

LOG_DIR = 'log_dir'
LOG_FORMATTER = logging.Formatter('[%(asctime)s] %(levelname)s %(name)s: %(message)s')

def get_logger(log_name, log_dir=LOG_DIR):
    if not os.path.isdir(log_dir):
        os.mkdir(log_dir)
    logger = logging.getLogger(log_name)
    logging.basicConfig(level=logging.INFO)
    log_handler = logging.FileHandler(os.path.join(log_dir, log_name))
    logger.addHandler(log_handler)
    log_handler.setFormatter(LOG_FORMATTER)
    log_handler.setLevel('INFO')
    return logger
Then, in the file where you want to log, do as follows:
logger = get_logger('filename')
To write a log message:
logger.info('logging information!')
I have scripts parent.py and child.py (many children), and I need separate logs for each: any logging within parent.py should go to parent.log, and logging within child.py should go to child.log.
I have the below in each script, but I get empty logs... why?
# main.py
import logging

import child

handler = logging.FileHandler('logs/main.log')
handler.setLevel(logging.DEBUG)
formatter = logging.Formatter("%(asctime)s [%(filename)s:%(lineno)s - %(funcName)10s()] %(levelname)s: %(message)s")
handler.setFormatter(formatter)
logger = logging.getLogger(__name__)
logger.addHandler(handler)

child.child_func()
logger.info('testing parent...')
# child.py
import logging

handler = logging.FileHandler('logs/child.log')
handler.setLevel(logging.DEBUG)
formatter = logging.Formatter("%(asctime)s [%(filename)s:%(lineno)s - %(funcName)10s()] %(levelname)s: %(message)s")
handler.setFormatter(formatter)
logger = logging.getLogger(__name__)
logger.addHandler(handler)

def child_func():
    logger.info('testing child...')
What I need to have is
#parent.log
{format} testing parent...
#child.log
{format} testing child...
The folks above are right about the default level on the loggers. Also, instead of spreading your configuration around everywhere, I find it more manageable to consolidate the logging configuration early in the application. See the example below.
Note: I don't expect this to be selected as an answer. I just wanted to point out what I believe is a better way of organizing the code.
main.py
import logging

import child

logger = logging.getLogger(__name__)

def setup_logging():
    main_handler = logging.FileHandler('logs/main.log')
    child_handler = logging.FileHandler('logs/child.log')
    # Note that you can re-use the same formatter for the handlers.
    formatter = logging.Formatter("%(asctime)s [%(filename)s:%(lineno)s - %(funcName)10s()] %(levelname)s: %(message)s")
    main_handler.setFormatter(formatter)
    child_handler.setFormatter(formatter)
    # By default, loggers inherit the level from their parent, so we only need
    # to set the level on the root logger if we want a single knob to control
    # the level.
    root_logger = logging.getLogger()
    root_logger.setLevel(logging.DEBUG)
    main_logger = logging.getLogger(__name__)
    child_logger = logging.getLogger('child')
    child_logger.propagate = False
    main_logger.addHandler(main_handler)
    child_logger.addHandler(child_handler)

def main():
    setup_logging()
    child.child_func()
    logger.info('testing parent...')

if __name__ == '__main__':
    main()
child.py
import logging

logger = logging.getLogger(__name__)

def child_func():
    logger.info('testing child...')
Setting up a root logger and a child logger (no main)
Here's an example of setting up the root logger to log to logs/main.log, and the child logger to go to logs/child.log
def setup_logging():
    root_handler = logging.FileHandler('logs/main.log')
    child_handler = logging.FileHandler('logs/child.log')
    # Note that you can re-use the same formatter for the handlers.
    formatter = logging.Formatter("%(asctime)s [%(filename)s:%(lineno)s - %(funcName)10s()] %(levelname)s: %(message)s")
    root_handler.setFormatter(formatter)
    child_handler.setFormatter(formatter)
    # By default, loggers inherit the level from their parent, so we only need
    # to set the level on the root logger if we want a single knob to control
    # the level.
    root_logger = logging.getLogger()
    root_logger.setLevel(logging.DEBUG)
    root_logger.addHandler(root_handler)
    child_logger = logging.getLogger('child')
    child_logger.propagate = False
    child_logger.addHandler(child_handler)
You can set the severity level on both handlers and loggers. A logger's effective level defaults to WARNING (inherited from the root logger), so with your code you would only get warning-level logs.
You can read more in this thread: What is the point of setLevel in a python logging handler?
# main.py
import logging

import child

handler = logging.FileHandler('logs/main.log')
formatter = logging.Formatter("%(asctime)s [%(filename)s:%(lineno)s - %(funcName)10s()] %(levelname)s: %(message)s")
handler.setFormatter(formatter)
logger = logging.getLogger(__name__)
logger.addHandler(handler)
logger.setLevel(logging.DEBUG)  # <-- changed

child.child_func()
logger.info('testing parent...')
logger.warning('testing parent...')
logger.debug('testing parent...')
# child.py
import logging

handler = logging.FileHandler('logs/child.log')
formatter = logging.Formatter("%(asctime)s [%(filename)s:%(lineno)s - %(funcName)10s()] %(levelname)s: %(message)s")
handler.setFormatter(formatter)
logger = logging.getLogger(__name__)
logger.addHandler(handler)
logger.setLevel(logging.DEBUG)  # <-- changed

def child_func():
    logger.info('testing child...')
    logger.warning('testing child...')
    logger.debug('testing child...')
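The effect of the default level can be seen in isolation. This sketch assumes a fresh interpreter where the root logger still has its default WARNING level; the logger name 'level_demo' and the in-memory stream are illustrative:

```python
import io
import logging

stream = io.StringIO()
logger = logging.getLogger('level_demo')
logger.addHandler(logging.StreamHandler(stream))
# No setLevel() call: the logger's effective level is inherited from the
# root logger, which defaults to WARNING.

logger.info('dropped')   # below WARNING, never reaches the handler
logger.warning('kept')

print(logger.getEffectiveLevel() == logging.WARNING)  # True
print('dropped' in stream.getvalue())                 # False
print('kept' in stream.getvalue())                    # True
```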
I am a Python newbie, trying to implement logging in my code. I have two modules:
main.py
submodule.py
main.py
import logging
from logging.handlers import RotatingFileHandler

import submodule
logger = logging.getLogger(__name__)
logger.setLevel(logging.DEBUG)
fh = RotatingFileHandler('master.log', maxBytes=2000000, backupCount=10)
formatter = logging.Formatter('%(asctime)s - %(name)s - %(levelname)s - %(message)s')
fh.setFormatter(formatter)
logger.addHandler(fh)
logger.debug('DEBUG LEVEL - MAIN MODULE')
logger.info('INFO LEVEL - MAIN MODULE')
submodule.loggerCall()
submodule.py
import logging
from logging.handlers import RotatingFileHandler

def loggerCall():
    logger = logging.getLogger(__name__)
    # logger.setLevel(logging.DEBUG)
    fh = RotatingFileHandler('master.log', maxBytes=2000000, backupCount=10)
    formatter = logging.Formatter('%(asctime)s - %(name)s - %(levelname)s - %(message)s')
    fh.setFormatter(formatter)
    logger.addHandler(fh)
    logger.debug('SUBMODULE: DEBUG LOGGING MODE : ')
    logger.info('Submodule: INFO LOG')
    return
I thought that as long as I call getLogger from my submodule, it would inherit the log level and handler from the root logger. However, in my case I have to specify the log level and handler again in the submodule to get messages printed to the same log file.
Also, if I have lots of methods and classes inside my submodule, how can I go about it without having to define my log level and handler again?
The idea is to have a single log file, with the main and sub modules printing to the same log based on the log level set in the main module.
The problem here is that you're not initializing the root logger; you're initializing the logger for your main module.
Try this for main.py:
import logging
from logging.handlers import RotatingFileHandler
import submodule
logger = logging.getLogger() # Gets the root logger
logger.setLevel(logging.DEBUG)
fh = RotatingFileHandler('master.log', maxBytes=2000000, backupCount=10)
formatter = logging.Formatter('%(asctime)s - %(name)s - %(levelname)s - %(message)s')
fh.setFormatter(formatter)
logger.addHandler(fh)
logger.debug('DEBUG LEVEL - MAIN MODULE')
logger.info('INFO LEVEL - MAIN MODULE')
submodule.loggerCall()
Then try this for submodule.py:
import logging

def loggerCall():
    logger = logging.getLogger(__name__)
    logger.debug('SUBMODULE: DEBUG LOGGING MODE : ')
    logger.info('Submodule: INFO LOG')
    return
Since you said you wanted to send log messages from all your submodules to the same place, you should initialize the root logger and then simply use the message-logging methods (along with setLevel() calls, as appropriate). Because there's no explicit handler for your submodule, logging.getLogger(__name__) will traverse the tree up to the root, where it will find the handler you established in main.py.
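This traversal can be demonstrated in a few lines. In this sketch an in-memory stream stands in for master.log, and the logger name 'submodule' is illustrative; the child logger has no handler of its own, yet its record still lands in the root handler:

```python
import io
import logging

# Root logger configured once, as in main.py (a StringIO stands in for master.log).
log_stream = io.StringIO()
handler = logging.StreamHandler(log_stream)
handler.setFormatter(logging.Formatter('%(name)s - %(levelname)s - %(message)s'))
root = logging.getLogger()
root.setLevel(logging.DEBUG)
root.addHandler(handler)

# A submodule logger with no handlers and no level of its own.
sub_logger = logging.getLogger('submodule')
sub_logger.debug('SUBMODULE: DEBUG LOGGING MODE')

print('submodule - DEBUG' in log_stream.getvalue())  # True: the record reached the root handler
```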
Currently I'm using logging.getLogger().setLevel(logging.DEBUG), which I think logs everything whose level is >= DEBUG. Is that a correct assumption? I can see a difference when I change logging.DEBUG to logging.ERROR, so I guess I'm correct.
Also how do I write these logging rows to a file?
Here is an example of writing logs to a file:
import logging
logger = logging.getLogger(__name__)
logger.setLevel(logging.INFO)
# create a file handler
handler = logging.FileHandler('hello.log')
handler.setLevel(logging.INFO)
# create a logging format
formatter = logging.Formatter('%(asctime)s - %(name)s - %(levelname)s - %(message)s')
handler.setFormatter(formatter)
# add the handlers to the logger
logger.addHandler(handler)
logger.info('Hello baby')
More detail:
http://victorlin.me/posts/2012/08/good-logging-practice-in-python/