Should I use the logging module or a logging class? - python

I'm writing a big program with many modules, and I want to use logging in each module. What is the best practice for logging in Python?
Should I import the standard logging module and use it in every file:
#proxy_control.py
#!/usr/bin/env python3
import logging

class ProxyClass():
    def check_proxy(self):
        pass

logging.basicConfig(filename='proxy.log', level=logging.INFO)
logging.info('Proxy works fine')
Or should I write one MyLog() class that inherits from the default logging and use it from all my other modules?
#proxy_control.py
#!/usr/bin/env python3
from my_log import MyLog

class ProxyClass():
    def check_proxy(self):
        pass

MyLog.send_log('Proxy works fine')

#my_log.py
#!/usr/bin/env python3
import logging

class MyLog():
    def send_log(value):
        pass

A typical convention is to define a logger at the top of every module that requires logging and then use that logger object throughout the module.
logger = logging.getLogger(__name__)
This way, loggers will follow your package names (i.e. package.subpackage.module). This is useful because loggers propagate messages upwards based on the logger name (i.e. parent.child passes messages up to parent). This means that you can do all your configuration on the top-level logger and it will receive messages from all the sub-loggers in your package. And if someone else is using your library, it will be very easy for them to configure which logging messages they get from it, because it will have a consistent namespace.
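As a quick sketch of that propagation (using a hypothetical package name, mypackage), attaching a handler to the top-level logger alone is enough to receive records from every child logger:
import logging

# Attach a handler only to the top-level package logger.
handler = logging.StreamHandler()
handler.setFormatter(logging.Formatter("%(name)s: %(message)s"))
top = logging.getLogger("mypackage")
top.addHandler(handler)
top.setLevel(logging.DEBUG)

# A logger deeper in the hierarchy still reaches that handler via propagation.
child = logging.getLogger("mypackage.subpackage.module")
child.debug("handled by the handler attached to 'mypackage'")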
For a library, you typically don't want to show logging messages unless the user explicitly enables them. Python logging will automatically log to stderr unless you set up a handler, so you should add a NullHandler to your top-level logger. This will prevent the messages from automatically being printed to stderr.
logger = logging.getLogger(__name__)
logger.addHandler(logging.NullHandler())
NOTE: NullHandler was added in Python 2.7; for earlier versions, you'll have to implement it yourself.
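For reference, the do-nothing handler for those older versions can be implemented along these lines (a minimal sketch of the recipe from the older docs):
import logging

class NullHandler(logging.Handler):
    # A handler that silently discards every record it receives.
    def emit(self, record):
        pass

logger = logging.getLogger(__name__)
logger.addHandler(NullHandler())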

Use the logging module, and leave logging configuration to your application's entry point (modules should not configure logging by themselves).

Related

How to use a single Python logging object throughout the Python application?

Python provides the logging module. We can use a logger in place of print and make use of its multiple log levels. The issue here is that when we use a logger, we pass the log string to the logger object, which means the logger object must be accessible from every function/method and class in the entire Python program.
logger = logging.getLogger('mylogger')
logger.info('This is a message from mylogger.')
Now my question is, for large Python programs that are possibly split across more than 1 source file and made up of a multitude of functions/methods and classes, how do we ensure that the same logger object is used everywhere to log messages? Or do I have the wrong idea on how the logging module is used?
Or do I have the wrong idea on how the logging module is used?
Yup, wrong idea.
for large Python programs that are ... split across more than 1 source file ... how do we ensure that the same logger object is used everywhere ?
That's not a goal.
We log messages so a maintenance engineer can pick up the pieces later. Use one logger per module. It is a feature that a logger will reveal the source module it came from; we use that to rapidly narrow down "what code ran?" and "with what input values?", which assists a maintainer in reproducing and repairing observed buggy output.
The usual idiom is to begin each module with
logger = logging.getLogger(__name__)
That way, logger.error(...) and similar calls will reveal the name of the module reporting the error, which lets maintainers rapidly focus on the code leading up to the error.
We do want all loggers to follow the same output format, so that all loggers produce compatible messages that can be uniformly parsed by e.g. some awk script. Typically an "if main" clause invokes basicConfig, just once, which takes care of that.
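Putting those pieces together, a minimal sketch of the idiom might look like this (the module contents and the format string are illustrative, not prescriptive):
import logging

logger = logging.getLogger(__name__)

def do_work():
    logger.info("doing work")  # the record carries this module's name

if __name__ == "__main__":
    # Configure logging exactly once, at the entry point, with a shared format.
    logging.basicConfig(
        level=logging.INFO,
        format="%(asctime)s %(name)s %(levelname)s %(message)s",
    )
    do_work()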
I can tell you what I do! I create a module in my application called logs that is imported by the entrypoint, creates and configures a root logger, and exposes a get_logger function that gets a child of that root logger with a given name. Consider:
# logs.py
import logging

ROOT = logging.getLogger("root")
# configure that logger in some way

def get_logger(name: str):
    # Logger objects expose getChild(), not getLogger()
    return ROOT.getChild(name)
Then in other modules, I do:
# lib.py
import logs

logger = logs.get_logger(__name__)

# the body of my module, calling logger.info or etc. as needed
The important part to remember is that logs.py cannot have dependencies on anything else in the package, or else you have to be very careful about the order in which you import things, lest you get a circular dependency problem!

Python Logger shared with modules

I have a question about how to share a Python logger with other modules.
Let's say that I have main.py, which imports protocol.py and declares a logger variable.
import logging
from protocol import Protocol
log = logging.getLogger(__name__)
log.debug("Currently in main.py")
protocol_obj_1 = Protocol(log)
protocol_obj_2 = Protocol()
Within protocol.py, let's say I wish to log additional data by sharing the same log variable.
class Protocol:
    def __init__(self, log_var=None):
        self.log_var = log_var

    def logData(self):
        if self.log_var is not None:
            self.log_var.debug("Currently within class Protocol")
        else:
            print("Currently no logging")
Is this a Pythonic approach? Or are there better ways to share the logger with other modules?
The most idiomatic Python way of creating a logger in a module is, as the documentation says:
A good convention to use when naming loggers is to use a module-level logger, in each module which uses logging, named as follows:
logger = logging.getLogger(__name__)
But it is not rare to want different modules to log into the same logger, for instance inside a Python (distribution) package (say, one distributed on PyPI). If such a package is "flat", you can use __package__ instead of __name__ in every module.
logger = logging.getLogger(__package__)
If you have import packages inside your distribution package, then you may prefer to hardcode the logger name in every module.
logger = logging.getLogger('mypackage')

How do you use logging in a package which by default falls back to print in Python?

I want to change the print statements in my package to use logging, so I will write my modules like
import logging

logger = logging.getLogger(__name__)

def func():
    logger.info("Calling func")
which is the recommended way.
However, many users do not initialize logging and hence will not see the output.
Is there a recommended way so that users who do not initialize logging can see info output, and those who explicitly set up logging do not get their specific configuration tampered with by my package?
As a general rule of thumb, modules should never configure logging directly (nor make other unsolicited changes to the shared STDOUT/STDERR), as that's the realm of the module's user. If the user wants the output, they should explicitly enable logging, and then, and only then, your module should be allowed to log.
Bonus points if you provide an interface for explicitly enabling logging within your module so that the user doesn't have to explicitly change levels / disable loggers of your inner components if they're interested only in logging from their own code.
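For example, a library could expose a small opt-in helper along these lines (the function name and the "your_module" logger name are hypothetical, not taken from the question):
import logging

def enable_logging(level=logging.INFO):
    # Attach a stderr handler to this library's top-level logger only,
    # leaving the user's root logger configuration untouched.
    handler = logging.StreamHandler()
    handler.setFormatter(logging.Formatter("%(name)s %(levelname)s: %(message)s"))
    lib_logger = logging.getLogger("your_module")
    lib_logger.addHandler(handler)
    lib_logger.setLevel(level)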
Of course, to keep using logging when a STDOUT/STDERR handler is not yet initialized you can use logging.NullHandler so all you have to do in your module is:
import logging

logger = logging.getLogger(__name__)
logger.addHandler(logging.NullHandler())  # initialize your logger with a NullHandler

def func():
    logger.info("Calling func")

func()  # (((crickets)))
Now you can continue using your logger throughout your module and until the user initializes logging your module won't trespass on the output, e.g.:
import your_module
your_module.func() # (((crickets)))
import logging
logging.root.setLevel(logging.INFO)
logging.info("Initialized!") # INFO:root:Initialized!
your_module.func() # INFO:your_module:Calling func
Of course, the actual logging format, level and other settings should also be in the realm of the user so if they change the format of the root log handler, it should be inherited by your module logger as well.

Custom Logging from multiple python modules

Task
I have logging setup across multiple modules in my application. All these modules send logs to the same file.
Client.py
import logging
import Settings
import actions1
import actions2

def initLogging(logToConsole=False):
    global logger
    logger = Settings.initLogging(Name, Dir, FileName, Level, CmdId, logToConsole)
    logger.info("Starting client")

def initActions():
    actions1.init(logger)
Settings.py
import logging
import logging.handlers

def initLogging(name, dir, file, level, cmdId, logToConsole=False):
    logger = logging.getLogger(name)
    # set up the logger properties with log levels, formatting, etc.
    return logger
actions1.py
def init(plogger):
    global logger
    logger = plogger

def some_function():
    # do something
    logger.info("some text")
I have multiple actions1/2/3.py modules and I want different logging levels and different logging formats for each of these action scripts. I went through the docs and came across the auxiliary module example, but it doesn't specify how I can achieve custom settings for the same logger when it is accessed from these separate scripts.
Any help on how to do this would be appreciated. I would like to initialize these custom settings when I invoke the init() method, but I am currently out of ideas on how to do that.
P.S. Please ignore the typos; it was a long piece of code.
Instead of creating a single global logger, you can use the logging.getLogger(name) method from the logging module. This way you will get a logger object, and you can then set the other parameters (level, handlers, formatting) on that logger object.
This is what the docs say:
The logger name hierarchy is analogous to the Python package hierarchy, and identical to it if you organise your loggers on a per-module basis using the recommended construction logging.getLogger(__name__). That’s because in a module, __name__ is the module’s name in the Python package namespace.
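As a rough sketch of how this could apply to the action modules above (the handler setup and format strings are assumptions, not taken from the question), each module gets its own named logger with its own level and format while still writing to the shared log file:
import logging

def make_logger(name, level, fmt, filename="app.log"):
    # One logger per action module, each with its own level and format,
    # all writing to the same file.
    logger = logging.getLogger(name)
    logger.setLevel(level)
    handler = logging.FileHandler(filename)
    handler.setFormatter(logging.Formatter(fmt))
    logger.addHandler(handler)
    logger.propagate = False  # keep each module's output format independent
    return logger

actions1_logger = make_logger("actions1", logging.DEBUG,
                              "%(asctime)s [actions1] %(message)s")
actions2_logger = make_logger("actions2", logging.WARNING,
                              "[actions2] %(levelname)s: %(message)s")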

How to access a logger object from other modules in a web application

I have my main.py as follows:
import logging
import os
import web

def is_test():
    if 'WEBPY_ENV' in os.environ:
        return os.environ['WEBPY_ENV'] == 'test'

app = web.application(router.urls, globals())

logging.basicConfig(filename="log/debug.log", level=logging.INFO)
global logger
logger = logging.getLogger("debug")

if (not is_test()) and __name__ == "__main__":
    app.run()
Now, here I have defined a variable named logger as global. So, can I access this variable anywhere in my application without redefining it? I am using web.py.
Basically, what I need is something like this: I want to initialize the logger once and be able to use it anywhere in my whole application. How can I do that?
You don't need to access any global variable since the logging module already provides access to logger objects anywhere in your code, that is, if you use:
logger = logging.getLogger("debug")
in other modules, you'll get the same logger object as in the main module.
I don't fully understand why you're using a "debug" logger when you can adjust the level of your logger to get the debug messages, but maybe yours was just an example or you really need that, so let's get to the point:
From the official documentation: The key benefit of having the logging API provided by a standard library module is that all Python modules can participate in logging, so your application log can include your own messages integrated with messages from third-party modules.
This means that you just need to configure logging at the beginning of your application; after that, whenever you need a logger, you call logging.getLogger(name) and you get that logger.
So there's no need of "tracking" your logger variable with:
global logger
logger = logging.getLogger("debug")
because whenever you need to log from your "debug" logger (in the middle of whatever you want) you just do something like:
my_debug_logger = logging.getLogger("debug")
my_debug_logger.info('some message')
In the end, the point is that once you import the logging module (import logging), you have access to each and every logger previously defined (and of course you can define new ones).
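A tiny illustration of that last point, using nothing beyond the standard library:
import logging

a = logging.getLogger("debug")
b = logging.getLogger("debug")  # same name, requested "from another module"
print(a is b)  # True: both names refer to the very same logger object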
