How to change python log level for unnamed logger?

Changing the logging level of a dependent package that properly names its logger (log = logging.getLogger(__name__)) is easy: logging.getLogger("name.of.package").setLevel(logging.WARNING).
But if the 3rd party package doesn't name their logger and just logs messages using logging.info("A super loud annoying message!"), how do I change that level? Adding the getLogger(...).setLevel(..) doesn't seem to work because the logger isn't named. Is it possible to change the logging level output of just one package without changing the level for the entire logging module?

If the logger is not named, the messages go to the root logger. You can get it by calling logging.getLogger() with no arguments.
So to set the log level, do this:
logging.getLogger().setLevel(logging.INFO)
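Note that changing the root logger's level affects every logger that defers to it, which is exactly what the question wants to avoid. One hedged workaround, since root-level calls like logging.info() all share the logger name "root" and can't be targeted by name, is to attach a logging.Filter to your handler that drops records by source path (noisy_package here is a hypothetical directory name of the offending package):

```python
import logging

class DropPackageFilter(logging.Filter):
    """Drop records emitted from a given package's source files.

    Filters on record.pathname because unnamed, root-level calls
    like logging.info() cannot be distinguished by logger name.
    """
    def __init__(self, package_fragment: str):
        super().__init__()
        self.package_fragment = package_fragment

    def filter(self, record: logging.LogRecord) -> bool:
        # Returning False drops the record.
        return self.package_fragment not in record.pathname

handler = logging.StreamHandler()
# 'noisy_package' is a hypothetical name; use the real package's directory.
handler.addFilter(DropPackageFilter("noisy_package"))
logging.getLogger().addHandler(handler)
```

This silences only records originating in that package's files while leaving the root logger's level untouched for everyone else.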

Related

python logging from multiple packages?

I am importing 2 modules I built into a Python script. I want logging from the script plus both modules to go into a single log file. The logging cookbook and related forum posts (example) show how to do it for one script and one imported module, with the limitation that they both use the same logger name. The module's logger is named after the module (using __name__), so the script can find and re-use it via the top-level package name.
So one script and one imported module can share a logger. But how do I attach the second imported module to that logger? And what if I want the script's log name to be distinct from the imported module's name? Here is the solution I came up with. It works, but it's ugly: I need three logger objects! Is there a better way?
# Code for module maps.publish.upload
import logging

logger = logging.getLogger(__name__)

class ServicePublisher(object):
    def __init__(self):
        logger.debug(f'start constructor {__class__.__name__}')
Class charts.staffing.report.StaffingReport is identical to ServicePublisher.
# log_test.py script
import logging
from pathlib import Path
from charts.staffing.report import StaffingReport
from maps.publish.upload import ServicePublisher
filename = Path(__file__).parts[-1]
logger = logging.getLogger(filename)
logger.setLevel(logging.DEBUG)
fh = logging.FileHandler('log_test.log')
fh.setLevel(logging.DEBUG)
formatter = logging.Formatter('%(name)s - %(levelname)s - %(message)s')
fh.setFormatter(formatter)
logger.addHandler(fh)
logger_maps = logging.getLogger('maps')
logger_maps.setLevel(logging.DEBUG)
logger_maps.addHandler(fh)
logger_charts = logging.getLogger('charts')
logger_charts.setLevel(logging.DEBUG)
logger_charts.addHandler(fh)
logger.debug(f'start {filename}')
publisher = ServicePublisher()
report = StaffingReport()
Here is the log output, exactly as expected:
log_test.py - DEBUG - start log_test.py
maps.publish.upload - DEBUG - start constructor ServicePublisher
charts.staffing.report - DEBUG - start constructor StaffingReport
Is there a way to use my logger object for all three cases, so I don't need logger_maps and logger_charts, while still maintaining distinct names in the log output?
I worked on this a bit more and didn't really come up with a better solution, but I'll share what I learned below.
1. I tried going to the root manager's logger dict and creating a logger for everything there. You don't need to create loggers for sub-modules in your package structure. E.g.: charts.staffing.report will have a parent logger charts.staffing, which has a parent charts, which is all you need, so I removed anything with a . in it.
logger_names = [name for name in logging.root.manager.loggerDict
if '.' not in name]
Then, I used logging.getLogger(<name>) on each name in the list and attached my handlers to it. This was so simple that I was in love with this solution, until I tested it some more. I was using debug level logging, and this turned on logging from every imported package in some APIs I use. Those APIs also import packages with loggers, so this got really spammy. By default, I think those loggers all bubble up to the root logger at the warning level, which seems more appropriate. So my first lesson is to be intentional about which loggers you tinker with, rather than just going through all of them.
2. As I mentioned above, Python bubbles logging up from sub-modules to parent modules' loggers. If I had it to do all over again, I would name all my packages starting with an org name prefix. For example, with packages charts and maps, if I work for the Acme Corporation, then I would name these acme.charts and acme.maps so I can just create a single logger acme that handles all my packages. That's harder once you have lots of code using your package, so it's something to consider when you start.
Note that the convention is to use the package name for your loggers, but the name is just a string and you can name them whatever you want. I could do something like this to add an artificial prefix that unifies all of them:
logger = logging.getLogger(f"acme.{__name__}")
Even though the user still imports my example packages as charts and maps, the loggers would be named acme.charts and acme.maps, which you could get with a single logger named acme. It's a good technical trick, but I would have to document and communicate that to my users, since it's not intuitive that a package charts would use a logger named acme.
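As a quick sketch of why the prefix trick works: child loggers propagate their records up the dotted hierarchy, so a single handler attached to acme sees everything from acme.charts and acme.maps while each record keeps its own distinct name (the names below are the example ones from this answer):

```python
import io
import logging

# One handler attached to the shared 'acme' parent logger.
stream = io.StringIO()
handler = logging.StreamHandler(stream)
handler.setFormatter(logging.Formatter('%(name)s - %(message)s'))

acme = logging.getLogger('acme')
acme.setLevel(logging.DEBUG)
acme.addHandler(handler)

# Loggers as the packages would create them with the prefix trick.
# Their level is NOTSET, so they inherit DEBUG from 'acme'.
logging.getLogger('acme.charts').debug('chart ready')
logging.getLogger('acme.maps').debug('map ready')

print(stream.getvalue())
# acme.charts - chart ready
# acme.maps - map ready
```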
3. I'm still creating a logger for each custom package I'm importing. I don't think it will have any significant overhead for my code, but might be something to consider at massive scale. At scale, you wouldn't want to be logging below warning level, which you get by default from your root logger, so this may all be moot.
4. To simplify my code, I wrote a little function that accepts a list of package names and a list of handlers as inputs. It gets a logger for each name and attaches the handlers. For one or two additional loggers, it's probably just as easy to create them in your code as it is to call my def. But if you had a big list of packages to create loggers for, then it would really clean up your code.
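The helper described in point 4 might look something like this (attach_handlers is a hypothetical name for it; delay=True just postpones opening the log file until the first record):

```python
import logging

def attach_handlers(package_names, handlers, level=logging.DEBUG):
    """Get a logger for each package name and attach every handler to it."""
    loggers = []
    for name in package_names:
        logger = logging.getLogger(name)
        logger.setLevel(level)
        for handler in handlers:
            logger.addHandler(handler)
        loggers.append(logger)
    return loggers

# Usage: one call replaces the repeated maps/charts setup from the question.
fh = logging.FileHandler('log_test.log', delay=True)
attach_handlers(['maps', 'charts'], [fh])
```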

logging fails to configure some imported modules (including imports) (python logging)

I want to print all logging messages from all imported modules. Certain imported modules are not logging.
Note that all the files in the libraries I care about have calls
logger = logging.getLogger(__name__) at the top.
Sorry if this is an easy problem. I've looked through a lot of posts without success.
I observe that the loggers for some modules are not being updated by the call to basicConfig
import logging
import local_util # a local util file
from transformers import T5ForConditionalGeneration
for n in logging.root.manager.loggerDict:
print(logging.getLogger(n))
logging.basicConfig(level=logging.DEBUG)
console = logging.StreamHandler()
console.setLevel(logging.DEBUG)
formatter = logging.Formatter("[%(asctime)s] %(levelname)s::%(module)s::%(funcName)s() %(message)s")
console.setFormatter(formatter)
logging.getLogger('').addHandler(console)
for n in logging.root.manager.loggerDict:
print(logging.getLogger(n))
In the first call to print(logging.getLogger(n)) almost all loggers are set with level WARNING.
In the second call, most loggers are set to DEBUG (including the one for local_util), except for the transformers library which all remain at level WARNING.
I can get transformers messages to print if I manually cycle through all loggers and reset their levels. Even if I use force=True in the call to basicConfig, the loggers for the transformers library do not get updated.
basicConfig() configures the root logger. Child loggers that don't set their own level (the default is NOTSET) use the effective level of the root logger.
So if transformers has set a WARNING level on its own logger, it won't use the root logger's DEBUG level.
You can set its level directly as the following:
transformers_logger = logging.getLogger('transformers')
transformers_logger.setLevel(logging.DEBUG)

Ignore logging.basicConfig in third party module

I am importing a third party module that I don't have control over and can't change the source. It does a logging.basicConfig(), which is messing up my own logger configuration. I have tried giving a different name to my logger using self.logger = logging.getLogger('some_random_name'), which results in every log message being printed twice: once with my format, and once with the format set by the basicConfig in that third party module.
Is there a way to ignore the basicConfig from the imported module?
logging.basicConfig() implicitly adds handlers to the root logger, so you can just delete those handlers with logging.root.handlers = [].
Further, the root logger may be set to an unwanted level; you can simply reset it with logging.root.setLevel(what_ever_you_want).
Even further, if you call logging.info, logging.error, etc. without configuring the root logger first, basicConfig() is called internally, and so a StreamHandler will be implicitly added to the root logger.

How to disable imported module logging at root level

I'm importing a module that is logging information at a warning level. I think the person who wrote this code is logging at the root level, i.e. the code is just:
import logging
logging.warn("foo")
I've tried the below code but it doesn't work, probably because the logging is sent to root or something.
logging.getLogger(module).setLevel(logging.ERROR)
Is there a way that I could disable this module's specific logging?
There are several things confusing here:
logging.getLogger(module).setLevel(logging.ERROR)
The ‘module’ here should be a string, and is usually the fully qualified name of the module, i.e. package.module.
Depending on the format, the name of the Logger is usually printed in the log. That way you can disable it easily.
For instance, if you have something like:
WARNING [foo.bar.baz] the message
The logger name should be foo.bar.baz.
But, if you think it is the root logger, then you can try:
logger = logging.getLogger()
logger.setLevel(logging.ERROR)
Another thing that can be confusing is the Python warnings module. Take a look at this answer to disable them: https://stackoverflow.com/a/14463362

How do I print out only log messages for a given logger?

Currently I am doing this in my code:
logger = logging.getLogger(__name__)
logger.info("something happened")
Then at the top of my main scripts I do this:
logging.basicConfig(level=logging.INFO)
Problem is that there are too many messages. Is there any way to restrict it to one or a few different loggers?
You can control individual loggers by name. (In your example, you used __name__, which will be the module name, so each logger will have a different name, module by module.) You can use the logging config file to control the logging level of each logger individually. Have a look at the PEP:
http://www.python.org/dev/peps/pep-0282/
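As a sketch of per-logger control, the same thing can be done in code with logging.config.dictConfig instead of a config file (noisy.module and my_app are hypothetical logger names standing in for your modules):

```python
import logging
import logging.config

logging.config.dictConfig({
    'version': 1,
    'disable_existing_loggers': False,
    'handlers': {
        'console': {'class': 'logging.StreamHandler'},
    },
    'root': {'level': 'INFO', 'handlers': ['console']},
    'loggers': {
        # Quiet one chatty module without touching the rest.
        'noisy.module': {'level': 'WARNING'},
        # Keep detailed output for your own code.
        'my_app': {'level': 'DEBUG'},
    },
})
```

Each named logger gets its own level, while everything else falls through to the root logger's INFO.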
