How do I create logs from within a module? (Python)

I have two files: one is my script and the other is a module I'm using.
In the module, I have this function:

def sumTwoInts(x, y):
    logger.debug('Lets sum two ints')
    return x + y

In my script, I want to set up the log file:

import logging
import myModule

logging.basicConfig(filename='logs.log')
logger = logging.getLogger()
logger.setLevel(logging.DEBUG)

myModule.sumTwoInts(1, 3)

How do I get the logger.debug call to write to the logs.log file created in my script?

Assuming you add

import logging
logger = logging.getLogger(__name__)

at the top of your module and then run your main program, logs.log will contain

DEBUG:myModule:Lets sum two ints

Isn't that what you want to achieve?

Your code already works if the logger is defined in the module file.
Note that I removed the filename option from basicConfig so the result can be checked on the console. If you set it back, the log output will be written to the file instead.

# myModule.py
import logging

logger = logging.getLogger(__name__)

def sumTwoInts(x, y):
    logger.debug('Lets sum two ints')
    return x + y

# main.py
import logging
import myModule

logging.basicConfig()
logger = logging.getLogger()
logger.setLevel(logging.DEBUG)

myModule.sumTwoInts(1, 3)

$ python main.py
DEBUG:myModule:Lets sum two ints

Related

How do I use logfile created in Conftest.py in other files?

I have the following setup:

../test/dirA
../test/Conftest.py
../test/Test_1.py
../test/logging.config

Code for Conftest.py:

import logging.config
from os import path
import time

config_file_path = path.join(path.dirname(path.abspath(__file__)), 'logging.conf')
log_file_path = path.join(path.dirname(path.abspath(__file__)), 'logFile.log')
logging.config.fileConfig(config_file_path)
logger = logging.getLogger(__name__)

fh = logging.FileHandler(log_file_path)
fh.setLevel(logging.DEBUG)
ch = logging.StreamHandler()
ch.setLevel(logging.ERROR)

# add the handlers to the logger
logger.addHandler(fh)
logger.addHandler(ch)

logging.info('done')

Code for ../test/Test_1.py:

import logging

logger = logging.getLogger(__name__)

def test_foo():
    logger = logging.getLogger(__name__)
    logger.info("testing foo")

def test_foobar():
    logger = logging.getLogger(__name__)
    logger.info("testing foobar")

I need to see logs from both files in logFile.log, but currently I only see the log from Conftest.py. What am I missing?
Also, I noticed that if I execute Conftest.py from the test folder (one dir up), for some reason it does not see logging.config.
Is my use of conftest correct for logging? How else can I achieve the same result?
Thank you
Update:
I used the approach described at
https://blog.muya.co.ke/configuring-multiple-loggers-python/
https://fangpenlin.com/posts/2012/08/26/good-logging-practice-in-python/
With some changes, this addresses my question.
Another lesson I learned is to use

logging.getLogger(__name__)

at function level, not at module level. Quoting the second article:

You can see lots of examples out there (including this article, I did it just for giving a short example) that get the logger at module level. They look harmless, but actually there is a pitfall: the Python logging module only respects loggers created before you load the configuration from a file, if you get the logger at module level like this.
Well, my first guess is that if you don't import Conftest.py in Test_1.py, the logger setup code is never reached and executed, so the logging in the Test_1 file goes to a default location.
Try this line to find the location of the log file:

logging.getLoggerClass().root.handlers[0].baseFilename
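A common fix here is to attach the handlers to the root logger in conftest.py instead of to getLogger(__name__); the getLogger(__name__) loggers in the test modules then propagate into the same handlers. A sketch under that assumption (file name and handler levels copied from the question):

```python
# conftest.py (sketch): configure the *root* logger once, so that every
# logging.getLogger(__name__) in the test modules propagates into it.
import logging
from os import path

# logFile.log next to conftest.py, as in the question
log_file_path = path.join(path.dirname(path.abspath(__file__)), 'logFile.log')

root = logging.getLogger()            # the root logger, not getLogger(__name__)
root.setLevel(logging.DEBUG)

fh = logging.FileHandler(log_file_path)
fh.setLevel(logging.DEBUG)
root.addHandler(fh)

ch = logging.StreamHandler()
ch.setLevel(logging.ERROR)
root.addHandler(ch)
```

With this in place, `logger.info("testing foo")` in Test_1.py reaches the file handler because unnamed ancestors are not involved: every logger propagates to the root.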

Logging from a child module ignores the added root logger stream handler

My goal is to redirect logging messages from a certain function into a file. The function is defined in another module. I added a StreamHandler to the main logger, but the message from the child_call function is not saved to tmp.log as expected.

# main.py
import logging
import os
import sys
from child_logger import child_call

logging.basicConfig(format='%(name)s:%(filename)s:%(lineno)d:%(message)s',
                    level=logging.INFO, stream=sys.stdout)
logger = logging.getLogger(__name__)

logger.info('here it is')
with open('tmp.log', 'w') as f:
    logger_stream_handler = logging.StreamHandler(stream=f)
    logger_stream_handler.setLevel(logging.INFO)
    logger.addHandler(logger_stream_handler)

    logger.info('I am outer')
    child_call()

    logger.removeHandler(logger_stream_handler)

# child_logger.py
import logging

logger = logging.getLogger(__name__)

def child_call():
    logger.info('I am inner')

Here is the output:

% python logger_test.py
__main__:logger_test.py:18:here it is
__main__:logger_test.py:25:I am outer
child_logger:child_logger.py:9:I am inner
% cat tmp.log
I am outer

I am expecting to see 'I am inner' in tmp.log. As far as I understand the logging module, there is a hierarchy of Logger objects; messages from children should propagate to the root Logger by default, and the root should handle all messages. What am I missing?
The problem is that your loggers are not correctly chained. They need to share a common root name. For example:

# main.py
logger = logging.getLogger("parent")

# child_logger.py
logger = logging.getLogger("parent.child")

Both of your log retrievals just ask for a logger named __name__, which is set to the name of the module, except in the top-level script, where it is "__main__". You are ending up with the equivalent of this:

# main.py
logger = logging.getLogger("__main__")

# child_logger.py
logger = logging.getLogger("child_logger")

Since "child_logger" is not a child of "__main__", its messages never pass through the handler you attached to the "__main__" logger; they propagate to the root logger instead. You need to enforce a common parent logger naming scheme to create the correct inheritance hierarchy.

How should Python logging be accomplished using a logging object across multiple modules?

I want to create a Python logging object in my main program and have logging, both in my main program and in the modules it uses, at the same logging level. The basic example given in the logging documentation is essentially as follows:

main.py:

import logging
import mylib

def main():
    logging.basicConfig(level=logging.INFO)
    logging.info('Started')
    mylib.do_something()
    logging.info('Finished')

if __name__ == '__main__':
    main()

mylib.py:

import logging

def do_something():
    logging.info('Doing something')

This works fine. I am not sure, however, how to do something similar with a logging object. The following, for example, does not work:

main.py:

import logging
import mylib

def main():
    verbose = True
    global log
    log = logging.getLogger(__name__)
    if verbose:
        log.setLevel(logging.INFO)
    else:
        log.setLevel(logging.DEBUG)
    log.info('Started')
    mylib.do_something()
    log.info('Finished')

if __name__ == '__main__':
    main()

mylib.py:

import logging

def do_something():
    log.info('Doing something')

It does not work because the global log object is not recognised in the mylib.py module. How should I be doing this? My two main goals are:
1. to have the logging level set in my main program propagate to any modules used, and
2. to be able to use log for logging, not logging (i.e. log.info("alert") as opposed to logging.info("alert")).
Your application should configure logging once (with basicConfig, or see logging.config) and then each file can have:

import logging

log = logging.getLogger(__name__)

# further down:
log.info("alert")

This way you use a logger everywhere and never access the logging module directly in your codebase. It allows you to configure all your loggers in your logging configuration (which, as said earlier, is set up once and loaded at the beginning of your application).
You can use a different logger in each module as follows:
import logging
LOG = logging.getLogger(__name__)
# Stuff
LOG.info("A log message")

Python global logging [duplicate]

This question already has answers here:
Python: logging module - globally
(5 answers)
Closed 6 years ago.
How do I make a Logger global so that I can use it in every module I make?
Something like this in moduleA:

import logging
import moduleB

log = logging.getLogger('')
result = moduleB.goFigure(5)
log.info('Answer was %s', result)

With this in moduleB:

def goFigure(integer):
    if not isinstance(integer, int):
        log.critical('not an integer')
    else:
        return integer + 1

Currently, I get an error because moduleB does not know what log is. How do I get around that?
You could make your own logging "module" which instantiates the logger, then have all of your code import that instead. Think:

logger.py:

import logging

log = logging.getLogger('')

codeA.py:

from logger import log

log.info('whatever')

codeB.py:

from logger import log

log.warning('some other thing')
Creating a global logger which can be used either to create a new log file or to return the logger for a global log file:

Create a module called myLogger.py that holds the log-creation code.

myLogger.py:

import logging

def myLog(name, fname='myGlobalLog.log'):
    '''Debug Log'''
    logger = logging.getLogger(name)
    logger.setLevel(logging.DEBUG)
    fhan = logging.FileHandler(fname)
    fhan.setLevel(logging.DEBUG)
    logger.addHandler(fhan)
    formatter = logging.Formatter('%(asctime)s - %(name)s - %(levelname)s - %(message)s')
    fhan.setFormatter(formatter)
    '''comment this to enable requests logger'''
    #logger.disabled = True
    return logger

Now, to create a new log in your module, say A.py:

from myLogger import myLog

log = myLog(__name__, 'newLog.log')
log.debug("In new log file")

So you have to pass the file name along while getting the logger.

To use the global logger in A.py:

from myLogger import myLog

log = myLog(__name__)
log.debug("In myGlobalLog file")

No need to pass the file name in this case, since we use the global log.
A module by default only has access to built-in functions and built-in constants. For all other variables, functions, etc. you have to use the import keyword.
Now, for your concrete example, you can import the log variable of moduleA in moduleB like this:

from moduleA import log

The following would be equivalent, because the logging module returns the same instance of the logger that was returned to moduleA:

import logging

log = logging.getLogger('')

Another solution could be to use the default logger of the logging module like this:

logging.info("Hello")
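Both approaches rely on logging.getLogger returning the same object every time it is called with the same name; a quick sketch of that guarantee:

```python
import logging

a = logging.getLogger('')     # the root logger, as used in moduleA
b = logging.getLogger('')     # an import elsewhere gets the very same object
assert a is b

x = logging.getLogger('app')  # the same holds for any named logger
y = logging.getLogger('app')
assert x is y
```

This is why a handler or level configured once in moduleA is visible from moduleB without passing anything around.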

Weird stuff happening while importing modules

I hate to give the question this heading, but I actually don't know what's happening, so here it goes.
I was doing another project in which I wanted to use the logging module. The code is distributed among a few files, and instead of creating separate logger objects for separate files, I thought of creating a logs.py with contents:

import sys, logging

class Logger:
    def __init__(self):
        formatter = logging.Formatter('%(filename)s:%(lineno)s %(levelname)s:%(message)s')
        stdout_handler = logging.StreamHandler(sys.stdout)
        stdout_handler.setFormatter(formatter)
        self.logger = logging.getLogger('')
        self.logger.addHandler(stdout_handler)
        self.logger.setLevel(logging.DEBUG)

    def debug(self, message):
        self.logger.debug(message)

and use this class like (in different files):

import logs

b = logs.Logger()
b.debug("Hi from a.py")

I stripped down the whole problem to ask the question here. Now I have 3 files: a.py, b.py, and main.py. All 3 files instantiate the logs.Logger class and print a debug message.
a.py and b.py import logs and print their debug messages.
main.py imports logs, a, and b, and prints its own debug message.
The file contents are like this: http://i.imgur.com/XoKVf.png
Why is the debug message from b.py printed 2 times and from main.py 3 times?
Specify a name for the logger; otherwise you always use the root logger.

import sys, logging

class Logger:
    def __init__(self, name):
        formatter = logging.Formatter('%(filename)s:%(lineno)s %(levelname)s:%(message)s')
        stdout_handler = logging.StreamHandler(sys.stdout)
        stdout_handler.setFormatter(formatter)
        self.logger = logging.getLogger(name)
        self.logger.addHandler(stdout_handler)
        self.logger.setLevel(logging.DEBUG)

    def debug(self, message):
        self.logger.debug(message)

From http://docs.python.org/howto/logging.html#advanced-logging-tutorial :

A good convention to use when naming loggers is to use a module-level logger, in each module which uses logging, named as follows:

logger = logging.getLogger(__name__)
logging.getLogger('') returns exactly the same object each time you call it. So each time you instantiate a Logger (why use old-style classes here?) you attach one more handler, resulting in printing to one more target. As all your targets point to the same thing, the last call to .debug() prints through each of the three StreamHandler objects pointing to sys.stdout, resulting in three lines being printed.
First: don't create your own Logger class. Just configure the existing logger classes with the existing logging configuration tools.
Second: each time you create your own Logger instance you also create new handlers and attach the new (duplicate) handler to the root logger. This leads to duplicated messages.
If you have several modules that must (1) run stand-alone and (2) also run as part of a larger, composite application, you need to do the following. It ensures that logging configuration is done only once.

import sys
import logging

logger = logging.getLogger(__file__)  # unique logger for a, b or main

if __name__ == "__main__":
    logging.basicConfig(stream=sys.stdout, level=logging.DEBUG,
                        format='%(filename)s:%(lineno)s %(levelname)s:%(message)s')

# From this point forward, you can use the `logger` object.
logger.info("Hi")
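If you do want to keep a wrapper along the lines of the question, one way to avoid duplicated output is to guard on logger.handlers so the handler is attached only on the first call (a sketch; the get_logger name is illustrative):

```python
import sys
import logging

def get_logger(name):
    logger = logging.getLogger(name)
    if not logger.handlers:  # attach the handler only on the first call
        handler = logging.StreamHandler(sys.stdout)
        handler.setFormatter(logging.Formatter(
            '%(filename)s:%(lineno)s %(levelname)s:%(message)s'))
        logger.addHandler(handler)
        logger.setLevel(logging.DEBUG)
    return logger

# Repeated calls reuse the same logger and the same single handler
log1 = get_logger("app")
log2 = get_logger("app")
assert log1 is log2
assert len(log1.handlers) == 1
```

This fixes the symptom, but the cleaner design is still the one above: configure handlers once at the entry point and use bare logging.getLogger(__name__) everywhere else.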
