How to set the logging level in Python?

I have the following complete Python (3.8.5) code example:
import logging

L = logging.getLogger(__name__)

def main():
    L.setLevel(logging.DEBUG)
    L.warning("This is warn")
    L.info("This is info")
    L.debug("This is debug")

if __name__ == "__main__":
    main()
which should print all three messages, but only prints the first one (the warning).
How do I circumvent this?

You should switch from L.setLevel(...) to logging.basicConfig(level=...), e.g.:
def main():
    logging.basicConfig(level=logging.DEBUG)
    L.warning("This is warn")
    L.info("This is info")
    L.debug("This is debug")
The method you used sets the level for that one logger only. Since you never attach a handler anywhere, records the logger accepts still end up at logging's "last resort" handler, which only emits WARNING and above. logging.basicConfig(level=...) configures the root logger with a handler and the given level, so even if you have multiple loggers, this is what determines which messages actually get printed by the process.
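If you would rather keep per-logger control, a minimal alternative sketch (my addition, not the answer above) is to attach a handler to the logger yourself, so records are not left to the WARNING-level "last resort" handler:
import logging

L = logging.getLogger(__name__)

def main():
    # Explicit handler: without one, only WARNING and above would be shown.
    L.addHandler(logging.StreamHandler())
    L.setLevel(logging.DEBUG)
    L.warning("This is warn")
    L.info("This is info")
    L.debug("This is debug")

if __name__ == "__main__":
    main()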

Use logging.basicConfig
import logging

logging.basicConfig(level=logging.DEBUG)
L = logging.getLogger(__name__)

def main():
    L.warning("This is warn")
    L.info("This is info")
    L.debug("This is debug")

if __name__ == "__main__":
    main()

Related

Logging to different files in different modules

I'm trying to write a few Python modules, and I need to log the output of different modules to different files.
module_a.py
import logging

logging.basicConfig(level=logging.INFO, filename="a.log")
logger = logging.getLogger(__name__)

def function_a():
    logger.info("this is function a")

if __name__ == "__main__":
    logger.info("this is module a")
    function_a()
module_b.py
import logging
import module_a

logging.basicConfig(level=logging.INFO, filename="b.log")
logger = logging.getLogger(__name__)

def function_b():
    module_a.function_a()
    logger.info("this is function b")

if __name__ == "__main__":
    logger.info("this is module b")
    function_b()
What I want is for module_b.py to always log to b.log and module_a.py to always log to a.log, whether a module is run directly or imported and then called.
The issue I have now is:
If I run
python3 module_a.py
all is good.
If I run
python3 module_b.py
then all logs end up in a.log. I understand that when module_a is imported, its logging configuration takes effect first and module_b's is ignored. So how can I make sure these two modules always log to the right files?
Thanks
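For context, a minimal sketch (my addition, not part of the question) of why module_a's configuration wins: logging.basicConfig is a no-op when the root logger already has handlers (unless you pass force=True, available on Python 3.8+):
import logging

logging.basicConfig(level=logging.INFO, filename="a.log")  # runs first, at import of module_a
logging.basicConfig(level=logging.INFO, filename="b.log")  # root already has a handler: silently ignored

print(logging.getLogger().handlers)  # a single FileHandler pointing at a.log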
Normally, logging rules are decided by the top-level script; imported modules just log without worrying about where the logs go. You want something different, so stay clear of basicConfig, which is just a helper, and have each module set up its own logger.
In this example, I pulled the setup into a function. It's an easy way to get rid of temporary variables like handler. You may want to add a formatter while you are at it. The function is the same in both scripts, so you may want to pull it into a single common utilities module (see the sketch after module_b below). Or maybe there is a reason to do things differently in different modules. There are so many ways to set up loggers that the function is really just a rough guess of what you'll do in your project.
module_a
import logging

def make_logger(name):
    handler = logging.FileHandler(filename=name + ".log")
    logger = logging.getLogger(name)
    logger.addHandler(handler)
    logger.setLevel(logging.INFO)
    return logger

logger = make_logger("a")

def function_a():
    logger.info("this is function a")

if __name__ == "__main__":
    logger.info("this is module a")
    function_a()
module_b
import logging
import module_a

def make_logger(name):
    handler = logging.FileHandler(filename=name + ".log")
    logger = logging.getLogger(name)
    logger.addHandler(handler)
    logger.setLevel(logging.INFO)
    return logger

logger = make_logger("b")

def function_b():
    module_a.function_a()
    logger.info("this is function b")

if __name__ == "__main__":
    logger.info("this is module b")
    function_b()
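If you do pull make_logger into a shared helper, a minimal sketch (assuming a hypothetical log_utils.py; the name is mine, not the answer's):
log_utils.py
import logging

def make_logger(name):
    # One file handler per logger, with the file named after the logger.
    handler = logging.FileHandler(filename=name + ".log")
    logger = logging.getLogger(name)
    logger.addHandler(handler)
    logger.setLevel(logging.INFO)
    return logger
module_a and module_b would then simply do from log_utils import make_logger and call logger = make_logger("a") or make_logger("b") as before.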

Logging in a Python script is not working: results in empty log files

I had a script with logging capabilities, and it stopped working (the logging, not the script). I wrote a small example to illustrate the problem:
import logging
from os import remove
from os.path import exists

def setup_logger(logger_name, log_file, level=logging.WARNING):
    # Erase log if already exists
    if exists(log_file):
        remove(log_file)
    # Configure log file
    l = logging.getLogger(logger_name)
    formatter = logging.Formatter('%(message)s')
    fileHandler = logging.FileHandler(log_file, mode='w')
    fileHandler.setFormatter(formatter)
    streamHandler = logging.StreamHandler()
    streamHandler.setFormatter(formatter)
    l.setLevel(level)
    l.addHandler(fileHandler)
    l.addHandler(streamHandler)

if __name__ == '__main__':
    setup_logger('log_pl', '/home/myuser/test.log')
    log_pl = logging.getLogger('log_pl')
    log_pl.info('TEST')
    log_pl.debug('TEST')
At the end of the script, the file test.log is created, but it is empty.
What am I missing?
Your setup_logger function specifies a (default) level of WARNING
def setup_logger(logger_name, log_file, level=logging.WARNING):
...and you later log two events that are at a lower level than WARNING, and are ignored as they should be:
log_pl.info('TEST')
log_pl.debug('TEST')
If you change your code that calls your setup_logger function to:
if __name__ == '__main__':
    setup_logger('log_pl', '/home/myuser/test.log', logging.DEBUG)
...I'd expect that it works as you'd like.
See the simple example in the Logging HOWTO page.
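As a quick way to confirm why the records were dropped, a minimal sketch (my addition, not part of the answer) that inspects the logger's threshold:
import logging

log_pl = logging.getLogger('log_pl')
log_pl.setLevel(logging.WARNING)  # the default used by setup_logger

print(logging.getLevelName(log_pl.getEffectiveLevel()))  # WARNING
print(log_pl.isEnabledFor(logging.INFO))                 # False: info() calls are discarded
print(log_pl.isEnabledFor(logging.WARNING))              # True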

How should I configure logging in a script and then use that configuration in only my modules?

I want to find out how logging should be organised, given that I write many scripts and modules that should feature similar logging. I want to be able to set the logging appearance and level from the script and have them propagate to my modules, and only my modules.
An example script could be something like the following:
import logging
import technicolor
import example_2_module

def main():
    verbose = True
    global log
    log = logging.getLogger(__name__)
    logging.root.addHandler(technicolor.ColorisingStreamHandler())
    # logging level
    if verbose:
        logging.root.setLevel(logging.DEBUG)
    else:
        logging.root.setLevel(logging.INFO)
    log.info("example INFO message in main")
    log.debug("example DEBUG message in main")
    example_2_module.function1()

if __name__ == '__main__':
    main()
An example module could be something like the following:
import logging

log = logging.getLogger(__name__)

def function1():
    print("printout of function 1")
    log.info("example INFO message in module")
    log.debug("example DEBUG message in module")
You can see that the module contains only minimal infrastructure and simply picks up the appearance and level set in the script. This has worked fine, but I've encountered a problem: other modules also use logging. This can result in output being printed twice, and in very detailed debug logging from modules that are not my own.
How should I code this such that the logging appearance/level is set from the script but then used only by my modules?
You need to set the propagate attribute to False so that the log message does not propagate to ancestor loggers. Here is the documentation for Logger.propagate -- it defaults to True. So just:
import logging
log = logging.getLogger(__name__)
log.propagate = False
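A minimal sketch (my own illustration, not from the answer) of what propagate = False does: with a handler on the root logger, a logger that has propagation disabled no longer hands its records up to root, so root's handler never prints them:
import logging

logging.basicConfig(level=logging.DEBUG)       # puts a stream handler on the root logger

a = logging.getLogger("pkg.keeps_propagating")
b = logging.getLogger("pkg.stops_here")
b.propagate = False                            # records stop at this logger

a.debug("reaches the root handler and is printed")
b.debug("dropped: no handler on this logger and no propagation to root")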

How should Python logging be accomplished using a logging object across multiple modules?

I want to create a Python logging object in my main program and have logging in both my main program and in the modules it uses at the same logging level. The basic example given in logging documentation is essentially as follows:
main.py:
import logging
import mylib

def main():
    logging.basicConfig(level=logging.INFO)
    logging.info('Started')
    mylib.do_something()
    logging.info('Finished')

if __name__ == '__main__':
    main()
mylib.py:
import logging

def do_something():
    logging.info('Doing something')
This works fine. I am not sure, however, of how to get a Python logging object doing something similar. The following, for example, does not work:
main.py:
import logging
import mylib

def main():
    verbose = True
    global log
    log = logging.getLogger(__name__)
    if verbose:
        log.setLevel(logging.INFO)
    else:
        log.setLevel(logging.DEBUG)
    log.info('Started')
    mylib.do_something()
    log.info('Finished')

if __name__ == '__main__':
    main()
mylib.py:
import logging

def do_something():
    log.info('Doing something')
It does not work because the global log object is not recognised in the mylib.py module. How should I be doing this? My two main goals are
to have the logging level that is set in my main program propagate through to any modules used and
to be able to use log for logging, not "logging" (i.e. log.info("alert") as opposed to logging.info("alert")).
Your application should configure the logging once (with basicConfig or see logging.config) and then in each file you can have:
import logging
log = logging.getLogger(__name__)
# further down:
log.info("alert")
So you use a logger everywhere and never log through the logging module's top-level functions directly. This lets you configure all your loggers in one place, in your logging configuration (which, as said earlier, is set up once and loaded at the beginning of your application).
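Applied to the question's example, a minimal sketch (assuming the same main.py/mylib.py layout, with the verbose flag folded into basicConfig) might look like:
main.py:
import logging
import mylib

log = logging.getLogger(__name__)

def main():
    verbose = True
    # Configure the root logger once; module-level loggers inherit this level.
    logging.basicConfig(level=logging.DEBUG if verbose else logging.INFO)
    log.info('Started')
    mylib.do_something()
    log.info('Finished')

if __name__ == '__main__':
    main()
mylib.py:
import logging

log = logging.getLogger(__name__)

def do_something():
    log.info('Doing something')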
You can use a different logger in each module as follows:
import logging
LOG = logging.getLogger(__name__)
# Stuff
LOG.info("A log message")
