easy way to change logger name for each logger instance - python

I have a root logging class that I created, which I'd like to use for each micro-service function that I'm deploying.
Example output log: [2023-01-01 13:46:26] - INFO - [utils.logger.<module>:5] - testaaaaa
The logger is defined in utils.logger, which is why that name shows up in the log (via %(name)s).
Instead of every instance using the same logger name, set with logger = logging.getLogger(__name__) inside the class, how can I get the same dotted structure based on where the logger is actually instantiated and called?
Even if I have to modify my logger class to accept a name parameter when initializing the object, that is fine. I like the dot notation because I will have files like routes.users.functiona, routes.users.functionb, routes.database.functiona and so on.
So I want to show which module the logging call came from. I can't seem to follow how logging captures the path when using __name__.
Also if you have any suggestions about making the following more robust :)
Here is my class:
import typing
import logging


class GlobalLogger:
    MINIMUM_GLOBAL_LEVEL = logging.DEBUG
    GLOBAL_HANDLER = logging.StreamHandler()
    LOG_FORMAT = "[%(asctime)s] - %(levelname)s - [%(name)s.%(funcName)s:%(lineno)d] - %(message)s"
    LOG_DATETIME_FORMAT = "%Y-%m-%d %H:%M:%S"

    def __init__(self, level: typing.Union[int, str] = MINIMUM_GLOBAL_LEVEL):
        self.level = level
        self.logger = self._get_logger()
        self.log_format = self._log_formatter()
        self.GLOBAL_HANDLER.setFormatter(self.log_format)

    def _get_logger(self):
        logger = logging.getLogger(__name__)
        logger.setLevel(self.level)
        logger.addHandler(self.GLOBAL_HANDLER)
        return logger

    def _log_formatter(self):
        return logging.Formatter(fmt=self.LOG_FORMAT, datefmt=self.LOG_DATETIME_FORMAT)

I think you have misunderstood the logging concept.
The variable __name__ holds the name of the Python module in which it appears. In your case it will have the value of the module where the class GlobalLogger is defined, and it does not matter from where you create an instance of your class. Check it with a print statement.
At the top of each of your Python modules (*.py), initialize a module logger simply with logger = logging.getLogger(__name__). In your code, call the logging methods via that module-level logger variable.
Log records are automatically propagated to parent loggers based on the dot notation: for a module A.B.C, records propagate C -> B -> A.
Output handlers and formatting can be defined on every logger, or just on a top-level logger such as A, or on the root logger (the empty name "").
I usually have a function, InitLogging, in which I get a handle to the root logger and set up the output handlers. You do not need to set up logging handlers in every module. A sketch of that pattern is shown below.
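For example, a minimal sketch of that pattern, assuming the question's format string; the function name init_logging and the module path routes/users.py are just placeholders:

# routes/users.py (any module) - one logger per module
import logging
logger = logging.getLogger(__name__)       # name becomes "routes.users"

# main.py - configure handlers once, on the root logger
import logging

def init_logging(level=logging.DEBUG):
    handler = logging.StreamHandler()
    handler.setFormatter(logging.Formatter(
        "[%(asctime)s] - %(levelname)s - [%(name)s.%(funcName)s:%(lineno)d] - %(message)s",
        datefmt="%Y-%m-%d %H:%M:%S"))
    root = logging.getLogger()              # the root logger ""
    root.setLevel(level)
    root.addHandler(handler)

init_logging()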
Conclusion:
If you still want to use your class, pass the logger name as a named parameter to __init__, and pass in the variable __name__ when you create an instance of your class.
I don't think you need to create so many specific logger classes; you can configure the logger properties later.
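For example, a rough sketch of that change to the class (untested; only the name parameter is new):

import logging

class GlobalLogger:
    MINIMUM_GLOBAL_LEVEL = logging.DEBUG
    GLOBAL_HANDLER = logging.StreamHandler()
    LOG_FORMAT = "[%(asctime)s] - %(levelname)s - [%(name)s.%(funcName)s:%(lineno)d] - %(message)s"
    LOG_DATETIME_FORMAT = "%Y-%m-%d %H:%M:%S"

    def __init__(self, name, level=MINIMUM_GLOBAL_LEVEL):
        self.level = level
        self.logger = self._get_logger(name)
        self.GLOBAL_HANDLER.setFormatter(self._log_formatter())

    def _get_logger(self, name):
        logger = logging.getLogger(name)    # use the caller's __name__, not this module's
        logger.setLevel(self.level)
        logger.addHandler(self.GLOBAL_HANDLER)
        return logger

    def _log_formatter(self):
        return logging.Formatter(fmt=self.LOG_FORMAT, datefmt=self.LOG_DATETIME_FORMAT)

# in routes/users.py:
logger = GlobalLogger(name=__name__).logger   # logger name becomes "routes.users"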

Related

How to make python module logger work with any main logger?

I've developed a module that I would like to be imported into any script and automatically have the module's logger sit in the hierarchy below the logger established in the main script, no matter what the main logger is called (root, main, etc.)
i.e.
a.py
import logging
import module

log = logging.getLogger()
log.info("Test Main")
module.test()

b.py
import logging
import module

log = logging.getLogger('main')
log.info("Test Main")
module.test()

module.py
import logging

mod_log = logging.getLogger(__name__)

def test():
    mod_log.info("Test Mod")
If the scripts ran successfully, I would expect the following output, but I just can't seem to get it to run:
a.py
ROOT - Test Main
ROOT.module - Test Mod
b.py
main - Test Main
main.module - Test Mod
This is not possible because there is no way to tell where a logger was defined and which one the "main" logger is supposed to be. The main script could define any number of loggers, including zero. The only logger that is guaranteed to exist is the root logger.
If you are fine with just taking a guess, you could make your logger a child of whatever is the first logger that was created.
module.py
import logging

logger_name = __name__
if logging.root.manager.loggerDict:
    parent = next(iter(logging.root.manager.loggerDict))
    logger_name = parent + '.' + logger_name

mod_log = logging.getLogger(logger_name)
Implement a Singleton design pattern for the Logger class, whose job is basically to add a formatter, set the logging level and any filtering, and finally return an instance of the logger.
Then use this to create an object:
logger = Logger.__call__().get_logger()
print(Logger())
Following the above method will create one and only one instance of the logger; no matter where you create the object of the Logger class, it will be the same old instance if it was created before.
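For illustration, a minimal sketch of such a singleton wrapper (the internals here are an assumption, not the original answer's code; the logger name "app" and the handler choice are placeholders):

import logging

class Logger:
    _instance = None

    def __new__(cls):
        # Build the wrapped logger only once; later calls return the same object
        if cls._instance is None:
            cls._instance = super().__new__(cls)
            logger = logging.getLogger("app")          # placeholder name
            handler = logging.StreamHandler()
            handler.setFormatter(logging.Formatter("%(name)s - %(levelname)s - %(message)s"))
            logger.addHandler(handler)
            logger.setLevel(logging.DEBUG)
            cls._instance._logger = logger
        return cls._instance

    def get_logger(self):
        return self._logger

log = Logger().get_logger()
log.info("same logger instance everywhere")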

Logging handlers disappear when I try to get a child logger with the Logger.getChild() method

I have a root logger with some handlers on it (SysLogHandler, let's say).
I want to get a child logger from the root logger by calling Logger.getChild(__name__), but when I do so, I get a new logger with no handlers on it. What am I doing wrong?
Thanks!
import logging
from logging import handlers

sysh = handlers.SysLogHandler()
logger = logging.getLogger()
logger.addHandler(sysh)
logger.handlers  # [<SysLogHandler (NOTSET)>] - everything is ok

c = logger.getChild('some_name')
c.handlers  # [] - where is the handler (SysLogHandler) from the parent (root) logger???
getChild does the same thing that logging.getLogger does: both call the getLogger method on the logging.Manager instance. The manager either returns an existing logger that has been registered in its loggerDict attribute, or instantiates a new logger, registers it, and returns it. In the latter case, the new logger must be configured by the caller.
getChild is not meant to create a clone of the logger it was called on. It is intended as
[...] a convenience method, useful when the parent logger is named using
e.g. __name__ rather than a literal string.
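Note that the child logger usually does not need its own handlers anyway: by default it propagates records up the hierarchy, so the root logger's handlers still emit them. A quick sketch (using a StreamHandler here instead of SysLogHandler just to see output locally):

import logging

root = logging.getLogger()
root.addHandler(logging.StreamHandler())
root.setLevel(logging.INFO)

child = root.getChild('some_name')
print(child.handlers)       # [] - still no handlers of its own
child.info("hello")         # emitted by the root handler via propagation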

extending logging.Logger module in python 3.5

I have been trying to create a new class of Logger by subclassing logging.Logger. The Python version is 3.5.
I have several modules in my application, and I configure logging only in the main module, where I set the logger class using logging.setLoggerClass(...).
However, when I retrieve the same logger instance from some other module, it still creates a new instance of the default Logger class, not an instance of the child class that I defined.
For example my code is :
# module 1
import logging

class MyLoggerClass(logging.getLoggerClass()):
    def __init__(self, name):
        super(MyLoggerClass, self).__init__(name)

    def new_logger_method(self, *args, **kwargs):
        # some new functionality
        ...

if __name__ == "__main__":
    logging.setLoggerClass(MyLoggerClass)
    mylogger = logging.getLogger("mylogger")
    # configuration of mylogger instance

# module 2
import logging

applogger = logging.getLogger("mylogger")
print(type(applogger))

def some_function():
    applogger.debug("in module 2 some_function")
When this code is executed, I expect the applogger in module 2 to be of type MyLoggerClass. I intend to use the new_logger_method for some new functionality.
However, since applogger turns out to be of type logging.Logger, when the code is run it throws an AttributeError saying that Logger has no attribute named new_logger_method.
Has anyone ever faced this issue?
Thanks in advance for any help!
Pranav
Instead of attempting to affect the global logger by changing the default logger factory, if you want your module to play nicely with any environment you should define a logger just for your module (and its children) and use it as a main logger for everything else deeper in your module structure. The trouble is that you explicitly want to use a different logging.Logger class than the default/globally defined one and the logging module doesn't provide an easy way to do context-based factory switching so you'll have to do it yourself.
There are many ways to do that but my personal preference is to be as explicit as possible and define your own logger module which you'll then import in your other modules in your package whenever you need to obtain a custom logger. In your case, you can create logger.py at the root of your package and do something like:
import logging

class CustomLogger(logging.Logger):
    def __init__(self, name):
        super(CustomLogger, self).__init__(name)

    def new_logger_method(self, caller=None):
        self.info("new_logger_method() called from: {}.".format(caller))

def getLogger(name=None, custom_logger=True):
    if not custom_logger:
        return logging.getLogger(name)
    logging_class = logging.getLoggerClass()  # store the current logger factory for later
    logging._acquireLock()  # use the global logging lock for thread safety
    try:
        logging.setLoggerClass(CustomLogger)  # temporarily change the logger factory
        logger = logging.getLogger(name)
        logging.setLoggerClass(logging_class)  # be nice, revert the logger factory change
        return logger
    finally:
        logging._releaseLock()
Feel free to include other custom log initialization logic in it if you so desire. Then from your other modules (and sub-packages) you can import this logger and use its getLogger() to obtain a local, custom logger. For example, all you need in module1.py is:
from . import logger  # or `from package import logger` for external/non-relative use

log = logger.getLogger(__name__)  # obtain a main logger for this module

def test():  # let's define a function we can later call for testing
    log.new_logger_method("Module 1")
This covers the internal use - as long as you stick to this pattern in all your modules/sub-modules you'll have the access to your custom logger.
When it comes to external use, you can write an easy test to show you that your custom logger gets created and that it doesn't interfere with the rest of the logging system therefore your package/module can be declared a good citizen. Under the assumption that your module1.py is in a package called package and you want to test it as a whole from the outside:
import logging  # NOTE: we're importing the global, standard `logging` module
import package.module1

logging.basicConfig()  # initialize the most rudimentary root logger
root_logger = logging.getLogger()  # obtain the root logger
root_logger.setLevel(logging.DEBUG)  # set root log level to DEBUG

# let's see the difference in Logger types:
print(root_logger.__class__)  # <class 'logging.RootLogger'>
print(package.module1.log.__class__)  # <class 'package.logger.CustomLogger'>

# you can also obtain the logger by name to make sure it's in the hierarchy
# NOTE: we'll be getting it from the standard logging module so outsiders need
# not know that we manage our logging internally
print(logging.getLogger("package.module1").__class__)  # <class 'package.logger.CustomLogger'>

# and we can test that it indeed has the custom method:
logging.getLogger("package.module1").new_logger_method("root!")
# INFO:package.module1:new_logger_method() called from: root!.

package.module1.test()  # let's call the test method within the module
# INFO:package.module1:new_logger_method() called from: Module 1.

# however, this will not affect anything outside of your package/module, e.g.:
test_logger = logging.getLogger("test_logger")
print(test_logger.__class__)  # <class 'logging.Logger'>
test_logger.info("I am a test logger!")
# INFO:test_logger:I am a test logger!
test_logger.new_logger_method("root - test")
# AttributeError: 'Logger' object has no attribute 'new_logger_method'

How to use python logging in multiple modules

I was wondering what the standard setup is for performing logging from within a Python app.
I am using the Logging class, and I've written my own logger class that instantiates the Logging class. My main then instantiates my logger wrapper class. However, my main instantiates other classes, and I want those other classes to also be able to write to the log file via the logger object in main.
How do I make that logger object such that it can be called by other classes? It's almost like we need some sort of static logger object to get this to work.
I guess the long and short of the question is: how do you implement logging within your code structure such that all classes instantiated from within main can write to the same log file? Do I just have to create a new logging object in each of the classes that points to the same file?
I don't know what you mean by the Logging class - there's no such class in Python's built-in logging. You don't really need wrappers: here's an example of how to do logging from arbitrary classes that you write:
import logging

# This class could be imported from a utility module
class LogMixin(object):
    @property
    def logger(self):
        name = '.'.join([__name__, self.__class__.__name__])
        return logging.getLogger(name)

# This class is just there to show that you can use a mixin like LogMixin
class Base(object):
    pass

# This could be in a module separate from B
class A(Base, LogMixin):
    def __init__(self):
        # Example of logging from a method in one of your classes
        self.logger.debug('Hello from A')

# This could be in a module separate from A
class B(Base, LogMixin):
    def __init__(self):
        # Another example of logging from a method in one of your classes
        self.logger.debug('Hello from B')

def main():
    # Do some work to exercise logging
    a = A()
    b = B()
    with open('myapp.log') as f:
        print('Log file contents:')
        print(f.read())

if __name__ == '__main__':
    # Configure only in your main program clause
    logging.basicConfig(level=logging.DEBUG,
                        filename='myapp.log', filemode='w',
                        format='%(name)s %(levelname)s %(message)s')
    main()
Generally it's not necessary to have loggers at class level: in Python, unlike say Java, the unit of program (de)composition is the module. However, nothing stops you from doing it, as I've shown above. The script, when run, displays:
Log file contents:
__main__.A DEBUG Hello from A
__main__.B DEBUG Hello from B
Note that code from both classes logged to the same file, myapp.log. This would have worked even with A and B in different modules.
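If you prefer the module-level convention over a class-level property, a rough equivalent (just a sketch; the module name is hypothetical) would be:

# a_module.py - hypothetical module containing class A
import logging

logger = logging.getLogger(__name__)   # one logger per module

class A(object):
    def __init__(self):
        logger.debug('Hello from A')   # logs under the module's name rather than the class name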
Try using logging.getLogger() to get your logging object instance:
http://docs.python.org/3/library/logging.html#logging.getLogger
All calls to this function with a given name return the same logger instance. This means that logger instances never need to be passed between different parts of an application.
UPDATE:
The recommended way to do this is to use the getLogger() function and configure it (setting a handler, formatter, etc...):
# main.py
import logging
import lib

def main():
    logger = logging.getLogger('custom_logger')
    logger.setLevel(logging.INFO)
    logger.addHandler(logging.FileHandler('test.log'))
    logger.info('logged from main module')
    lib.log()

if __name__ == '__main__':
    main()

# lib.py
import logging

def log():
    logger = logging.getLogger('custom_logger')
    logger.info('logged from lib module')
If you really need to extend the logger class, take a look at logging.setLoggerClass(klass).
UPDATE 2:
An example of how to add a custom logging level without changing the Logger class:
# main.py
import logging
import lib

# Extend Logger class
CUSTOM_LEVEL_NUM = 9
logging.addLevelName(CUSTOM_LEVEL_NUM, 'CUSTOM')

def custom(self, msg, *args, **kwargs):
    self._log(CUSTOM_LEVEL_NUM, msg, args, **kwargs)

logging.Logger.custom = custom

# Do global logger instance setup
logger = logging.getLogger('custom_logger')
logger.setLevel(logging.INFO)
logger.addHandler(logging.FileHandler('test.log'))

def main():
    logger = logging.getLogger('custom_logger')
    logger.custom('logged from main module')
    lib.log()

if __name__ == '__main__':
    main()
Note that adding a custom level is not recommended: http://docs.python.org/2/howto/logging.html#custom-levels
Defining a custom handler and maybe using more than one logger may do the trick for your other requirement: optional output to stderr.
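For example, a minimal sketch along those lines (the add_stderr flag and the helper name get_logger are made up for illustration):

import sys
import logging

def get_logger(name='custom_logger', add_stderr=False):
    logger = logging.getLogger(name)
    if not logger.handlers:                    # avoid stacking handlers on repeated calls
        logger.setLevel(logging.INFO)
        logger.addHandler(logging.FileHandler('test.log'))
        if add_stderr:
            logger.addHandler(logging.StreamHandler(sys.stderr))   # optional console copy
    return logger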

Weird stuff happening while importing modules

I hate to give the question this heading, but I actually don't know what's happening, so here it goes.
I was doing another project in which I wanted to use the logging module. The code is distributed among a few files, and instead of creating separate logger objects for separate files, I thought of creating a logs.py with contents
import sys, logging

class Logger:
    def __init__(self):
        formatter = logging.Formatter('%(filename)s:%(lineno)s %(levelname)s:%(message)s')
        stdout_handler = logging.StreamHandler(sys.stdout)
        stdout_handler.setFormatter(formatter)
        self.logger = logging.getLogger('')
        self.logger.addHandler(stdout_handler)
        self.logger.setLevel(logging.DEBUG)

    def debug(self, message):
        self.logger.debug(message)
and use this class like this (in different files):
import logs

b = logs.Logger()
b.debug("Hi from a.py")
I stripped down the whole problem to ask the question here. Now I have 3 files: a.py, b.py & main.py. All 3 files instantiate the logs.Logger class and print a debug message.
a.py & b.py import "logs" and print their debug messages.
main.py imports logs, a & b, and prints its own debug message.
The file contents are like this: http://i.imgur.com/XoKVf.png
Why is the debug message from b.py printed 2 times & the one from main.py 3 times?
Specify a name for the logger, otherwise you always use the root logger.
import sys, logging

class Logger:
    def __init__(self, name):
        formatter = logging.Formatter('%(filename)s:%(lineno)s %(levelname)s:%(message)s')
        stdout_handler = logging.StreamHandler(sys.stdout)
        stdout_handler.setFormatter(formatter)
        self.logger = logging.getLogger(name)
        self.logger.addHandler(stdout_handler)
        self.logger.setLevel(logging.DEBUG)

    def debug(self, message):
        self.logger.debug(message)
http://docs.python.org/howto/logging.html#advanced-logging-tutorial :
A good convention to use when naming loggers is to use a module-level
logger, in each module which uses logging, named as follows:
logger = logging.getLogger(__name__)
logging.getLogger('') will return exactly the same object each time you call it. So each time you instantiate a Logger (why use old-style classes here?) you attach one more handler, resulting in printing to one more target. As all your targets point to the same thing, the last call to .debug() will print through each of the three StreamHandler objects pointing to sys.stdout, resulting in three lines being printed.
First: don't create your own Logger class.
Just configure the existing logger classes with the existing logging configuration tools.
Second: each time you create your own Logger object you also create new handlers and attach the new (duplicate) handler to the root logger. This leads to duplicated messages.
If you have several modules that must (1) run stand-alone and (2) also run as part of a larger, composite application, you need to do the following. It will assure that logging configuration is done only once.
import sys
import logging

logger = logging.getLogger(__file__)  # Unique logger for a, b or main

if __name__ == "__main__":
    logging.basicConfig(stream=sys.stdout, level=logging.DEBUG,
                        format='%(filename)s:%(lineno)s %(levelname)s:%(message)s')

# From this point forward, you can use the `logger` object.
logger.info("Hi")
