How could I use the same Python logger in different files? - python

in the main.py I configure the python logger like this:
import func
import logging
import os.path as osp
import time

logfile = '{}.log'.format(time.strftime('%Y-%m-%d-%H-%M-%S'))
logfile = osp.join('./res/', logfile)
FORMAT = '%(levelname)s %(filename)s(%(lineno)d): %(message)s'
logging.basicConfig(level=logging.INFO, format=FORMAT, filename=logfile)
logger = logging.getLogger(__name__)
logger.addHandler(logging.StreamHandler())
This works fine in main.py, but I still need to use the same logger in func.py, a module imported by main.py. I want its log messages printed to the screen and written to the same log file as in main.py. How can I do this?

A simple solution is to define the logger in one file and import it wherever you need it. When you import a module, you have access to its definitions; just look at the imports in the code you posted.

Related

I can't get Python logging module to write to the log file from different module

I have a project that consists of several modules. There is a main module (main.py) that creates a Tk GUI and loads the data. It passes this data to process.py, which processes the data using functions from checks.py. I am trying to get all the modules to log to a single file. In main.py, log messages are written to the log file, but in the other modules they are only written to the console. I assume it's to do with the import line executing part of the code before the code in main.py has set up the logger, but I can't work out how to arrange things to avoid that. It seems like a reasonably common question on Stack Overflow, but I couldn't get the other answers to work for me. I am sure I am not missing much. Simplified code is shown below:
I have tried moving the logging code inside and outside of various functions in the modules. The code I started from is from Corey Schafer's YouTube channel.
Main.py
import logging
import tempfile
from process import process_data

def main():
    logger = logging.getLogger()
    logger.setLevel(logging.DEBUG)
    formatter = logging.Formatter('%(asctime)s:%(name)s:%(message)s')
    templogfile = tempfile.gettempdir() + '\\' + 'TST_HA_Debug.log'
    file_handler = logging.FileHandler(templogfile)
    file_handler.setLevel(logging.DEBUG)
    file_handler.setFormatter(formatter)
    stream_handler = logging.StreamHandler()
    stream_handler.setFormatter(formatter)
    logger.addHandler(file_handler)
    logger.addHandler(stream_handler)
    logger.debug('Logging has started') # This gets written to the file
    process_data(data_object) # call process_data in process.py
process.py
import logging

def process_data(data):
    logger = logging.getLogger(__name__)
    logger.debug('This message is logged by process') # This won't get written to the log file but does get written to the console
    # do some stuff with data here and log some msgs
    return
Main.py will write to the log file, process.py will only write to the console.
I've rewritten your scripts a little so that this code can stand alone. If I changed too much, let me know and I can revisit it. These two files are an example of logging to a file. Note my comments:
## main.py
import logging
import os
from process import process_data

def main():
    # Give this logger a name
    logger = logging.getLogger("Example")
    logger.setLevel(logging.DEBUG)
    formatter = logging.Formatter('%(asctime)s:%(name)s:%(message)s')
    # I changed this to the same directory, for convenience
    templogfile = os.path.join(os.getcwd(), 'TST_HA_Debug.log')
    file_handler = logging.FileHandler(templogfile)
    file_handler.setLevel(logging.DEBUG)
    file_handler.setFormatter(formatter)
    stream_handler = logging.StreamHandler()
    stream_handler.setFormatter(formatter)
    logger.addHandler(file_handler)
    logger.addHandler(stream_handler)
    logging.getLogger("Example").debug('Logging has started') # This still gets written to the file
    process_data() # call process_data in process.py

if __name__ == '__main__':
    main()
## process.py
import logging

def process_data(data=None):
    # make sure to grab the correct logger
    logger = logging.getLogger("Example")
    logger.debug('This message is logged by process') # This does get logged to file now
    # do some stuff with data here and log some msgs
    return
Why does this work? Because the module-level functions use the default root logger, which is not the one you've configured. For more details on this see these docs. There is a similar question that goes more into depth here.
By getting the configured logger before you start logging, you are able to log to the right configuration. Hope this helps!
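The named-logger approach works because `getLogger` returns the same object for a given name, and the logger hierarchy means a child such as `"Example.process"` would also reach the same handlers by propagation. A small sketch of that mechanism (the `ListHandler` class is only here so we can inspect what arrived):

```python
import logging

records = []

class ListHandler(logging.Handler):
    """Collects formatted messages so we can inspect propagation."""
    def emit(self, record):
        records.append(record.getMessage())

# Configure a named parent logger, as in the answer above
parent = logging.getLogger("Example")
parent.setLevel(logging.DEBUG)
parent.addHandler(ListHandler())

# A child logger named "Example.process" has no handlers of its own,
# but its records propagate up to the "Example" handlers.
child = logging.getLogger("Example.process")
child.debug("logged by the child")
```

After this runs, `records` contains the child's message, even though only the parent has a handler.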

How do I use logfile created in Conftest.py in other files?

I have following setup
../test/dirA
../test/Conftest.py
../test/Test_1.py
../test/logging.config
Code for Conftest.py
import logging.config
from os import path
import time
config_file_path = path.join(path.dirname(path.abspath(__file__)), 'logging.conf')
log_file_path = path.join(path.dirname(path.abspath(__file__)), 'logFile.log')
logging.config.fileConfig(config_file_path)
logger = logging.getLogger(__name__)
fh = logging.FileHandler(log_file_path)
fh.setLevel(logging.DEBUG)
ch = logging.StreamHandler()
ch.setLevel(logging.ERROR)
# add the handlers to the logger
logger.addHandler(fh)
logger.addHandler(ch)
logging.info('done')
Code for ../test/Test_1.py
import logging

logger = logging.getLogger(__name__)

def test_foo():
    logger = logging.getLogger(__name__)
    logger.info("testing foo")

def test_foobar():
    logger = logging.getLogger(__name__)
    logger.info("testing foobar")
I need to see logs from both files in logFile.log but currently I see only log from conftest.py. What am I missing?
Also, I noticed that if I execute Conftest.py from the test folder (one directory up), for some reason it does not see logging.config.
Is my usage of conftest correct for logging?
How else can I achieve the same result?
Thank you
Update:
I used the approach described at:
https://blog.muya.co.ke/configuring-multiple-loggers-python/.
https://fangpenlin.com/posts/2012/08/26/good-logging-practice-in-python/
With some changes, this addresses my question.
Another lesson I learned: call
logging.getLogger(__name__)
at function level, not at module level.
You can see lots of examples out there (including that article; I did it just to keep the example short) that get the logger at module level. They look harmless, but there is a pitfall: the logging module disables any logger created before you load the configuration from a file, which is exactly what happens when you get the logger at module level like this.
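That pitfall can also be avoided by passing `disable_existing_loggers=False` to `fileConfig`. A minimal sketch: the config is inlined as a string here purely for illustration (normally it lives in a file such as logging.conf), and the `myapp.early` name is hypothetical:

```python
import io
import logging
import logging.config

# A module-level logger created *before* the configuration is loaded
early = logging.getLogger("myapp.early")

# Minimal illustrative fileConfig-format configuration
MINIMAL_CONF = """
[loggers]
keys=root

[handlers]
keys=console

[formatters]
keys=plain

[logger_root]
level=DEBUG
handlers=console

[handler_console]
class=StreamHandler
level=DEBUG
formatter=plain
args=(sys.stderr,)

[formatter_plain]
format=%(name)s %(message)s
"""

# disable_existing_loggers=False keeps "myapp.early" alive; with the
# default (True), fileConfig would silence every logger created above it
logging.config.fileConfig(io.StringIO(MINIMAL_CONF),
                          disable_existing_loggers=False)
```

With the default `disable_existing_loggers=True`, `early.disabled` would be set and its messages silently dropped.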
Well, my first guess is that if you don't import Conftest.py in Test_1.py, the logger setup code is never reached and executed, so logging in the Test_1 file goes to a default-location log file.
Try this line to find the location of the log file:
logging.getLoggerClass().root.handlers[0].baseFilename
I used approach described at https://blog.muya.co.ke/configuring-multiple-loggers-python/.
With some changes, this addresses my question.

How should Python logging be accomplished using a logging object across multiple modules?

I want to create a Python logging object in my main program and have logging in both my main program and in the modules it uses at the same logging level. The basic example given in logging documentation is essentially as follows:
main.py:
import logging
import mylib

def main():
    logging.basicConfig(level=logging.INFO)
    logging.info('Started')
    mylib.do_something()
    logging.info('Finished')

if __name__ == '__main__':
    main()
mylib.py:
import logging

def do_something():
    logging.info('Doing something')
This works fine. I am not sure, however, how to do something similar with a logging object of my own. The following, for example, does not work:
main.py:
import logging
import mylib

def main():
    verbose = True
    global log
    log = logging.getLogger(__name__)
    if verbose:
        log.setLevel(logging.INFO)
    else:
        log.setLevel(logging.DEBUG)
    log.info('Started')
    mylib.do_something()
    log.info('Finished')

if __name__ == '__main__':
    main()
mylib.py:
import logging

def do_something():
    log.info('Doing something')
It does not work because the global log object is not recognised in the mylib.py module. How should I be doing this? My two main goals are
to have the logging level that is set in my main program propagate through to any modules used and
to be able to use log for logging, not "logging" (i.e. log.info("alert") as opposed to logging.info("alert")).
Your application should configure the logging once (with basicConfig or see logging.config) and then in each file you can have:
import logging
log = logging.getLogger(__name__)
# further down:
log.info("alert")
So you use a logger everywhere and you don't directly access the logging module in your codebase. This allows you to configure all your loggers in your logging configuration (which, as said earlier, is configured once and loaded at the beginning of your application).
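As a sketch of what "configured once and loaded at the beginning of your application" can look like, here is a minimal `dictConfig` call. The handler and formatter names are illustrative, not part of any required API:

```python
import logging
import logging.config

# One-time configuration at application startup
logging.config.dictConfig({
    "version": 1,
    "disable_existing_loggers": False,
    "formatters": {
        "plain": {"format": "%(asctime)s:%(name)s:%(message)s"},
    },
    "handlers": {
        "console": {
            "class": "logging.StreamHandler",
            "formatter": "plain",
        },
    },
    "root": {"level": "INFO", "handlers": ["console"]},
})

# Every module then just does this, with no further setup:
log = logging.getLogger(__name__)
log.info("alert")  # routed through the root console handler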
You can use a different logger in each module as follows:
import logging
LOG = logging.getLogger(__name__)
# Stuff
LOG.info("A log message")

Python global logging [duplicate]

This question already has answers here:
Python: logging module - globally
(5 answers)
Closed 6 years ago.
How do I make a Logger global so that I can use it in every module I make?
Something like this in moduleA:
import logging
import moduleB
log = logging.getLogger('')
result = moduleB.goFigure(5)
log.info('Answer was', result)
With this in moduleB:
def goFigure(integer):
    if not isinstance(integer, int):
        log.critical('not an integer')
    else:
        return integer + 1
Currently, I will get an error because moduleB does not know what log is. How do I get around that?
You could make your own logging "module" which instantiates the logger, then have all of your code import that instead. Think:
logger.py:
import logging
log = logging.getLogger('')
codeA.py:
from logger import log
log.info('whatever')
codeB.py:
from logger import log
log.warn('some other thing')
Creating a global logger which can be used to
create a new log file, or
return the logger for a global log file.
Create a module called myLogger.py: this will have the log creation code.
myLogger.py:
import logging

def myLog(name, fname='myGlobalLog.log'):
    '''Debug Log'''
    logger = logging.getLogger(name)
    logger.setLevel(logging.DEBUG)
    fhan = logging.FileHandler(fname)
    fhan.setLevel(logging.DEBUG)
    logger.addHandler(fhan)
    formatter = logging.Formatter('%(asctime)s - %(name)s - %(levelname)s - %(message)s')
    fhan.setFormatter(formatter)
    '''comment this to enable requests logger'''
    #logger.disabled = True
    return logger
Now, to create a new log in your module, say A.py:
from myLogger import myLog
log = myLog(__name__, 'newLog.log')
log.debug("In new log file")
Thus you have to pass the file name when getting a logger for a new log file.
To use the global logger in A.py:
from myLogger import myLog
log = myLog(__name__)
log.debug("In myGlobalLog file")
There is no need to pass the file name in this case, as we are going to use the global log.
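One caveat with this helper is worth flagging: every call to myLog with the same name attaches another FileHandler, so each message starts appearing multiple times in the file. A guarded variant, as a sketch (the `delay=True` is only there to postpone opening the file until something is actually logged):

```python
import logging

def myLog(name, fname='myGlobalLog.log'):
    '''Debug log, safe to call repeatedly with the same name.'''
    logger = logging.getLogger(name)
    logger.setLevel(logging.DEBUG)
    # Only attach a handler on the first call for this name; otherwise
    # every subsequent call duplicates each output line
    if not logger.handlers:
        fhan = logging.FileHandler(fname, delay=True)
        fhan.setLevel(logging.DEBUG)
        fhan.setFormatter(logging.Formatter(
            '%(asctime)s - %(name)s - %(levelname)s - %(message)s'))
        logger.addHandler(fhan)
    return logger
```

Calling `myLog('A')` twice now returns the same logger with a single handler instead of two.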
By default, a module only has access to builtin functions and builtin constants. For all other variables and functions you have to use the import keyword.
Now for your concrete example, you can import the log variable of moduleA into moduleB like this:
from moduleA import log
The following would be equivalent, because the logging module returns the same logger instance that was returned to moduleA:
import logging
log = logging.getLogger('')
Another solution would be to use the default (root) logger of the logging module directly, like this:
logging.info("Hello")
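That equivalence can be checked directly, since `getLogger` always returns the same object for a given name:

```python
import logging

# getLogger caches loggers by name, which is what makes the
# "import it anywhere" pattern work
a = logging.getLogger('')   # the root logger
b = logging.getLogger('')
assert a is b
assert a is logging.getLogger()  # '' and no argument both mean the root logger
```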

python logging filter

I am really confused about the filter feature in logging. I have read the docs and the logging cookbook.
I have an application written in several files. Each file has a class and its exceptions.
- main file: mcm
- in mcm I import configurator and initiate its class
- in configurator I import rosApi and initiate its class
What I want to achieve:
- in main file, decide from which modules and its levels I want to log.
- one handler for all. configurable in main file
The idea is that I would like to be able to turn debugging of given modules on and off in one place, customizable per run with an option passed to the main file.
For example:
If I pass -d it will print (additionally) all debug information from configurator but not rosApi.
If I pass -D it will print all debug from configurator AND rosApi
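One way to sketch this -d/-D behaviour without filters is to set levels on each module's named logger from the main file. The module names `configurator` and `rosApi` are taken from the question; `apply_verbosity` is a hypothetical helper, not an existing API:

```python
import logging

# One handler for all modules, configured in the main file
handler = logging.StreamHandler()
handler.setFormatter(logging.Formatter(
    '%(asctime)s %(module)s %(levelname)s %(message)s'))
root = logging.getLogger()
root.addHandler(handler)
root.setLevel(logging.INFO)

def apply_verbosity(flag):
    """-d: debug configurator only; -D: debug configurator and rosApi."""
    if flag in ('-d', '-D'):
        logging.getLogger('configurator').setLevel(logging.DEBUG)
    if flag == '-D':
        logging.getLogger('rosApi').setLevel(logging.DEBUG)

apply_verbosity('-d')
```

Each module then does `logger = logging.getLogger(__name__)` and inherits whichever level the main file chose for it.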
What I do is create a logger module, something along these lines:
import os
import logging
import logging.handlers

logger = logging.getLogger()
fh = logging.handlers.RotatingFileHandler(logfile, maxBytes=10000000, backupCount=10)
fm = logging.Formatter('%(asctime)s %(module)s %(levelname)s %(message)s')
fh.setFormatter(fm)
logger.addHandler(fh)
level = os.environ['LOGGING'].upper()
level = getattr(logging, level)
logging.getLogger().setLevel(level)
then I do an import logger at the top of all my modules.
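Since the question asks about filters specifically: a `logging.Filter` attached to the single handler can also do the per-module selection. A minimal sketch (`ModuleFilter` is a hypothetical name; the module names come from the question):

```python
import logging

class ModuleFilter(logging.Filter):
    """Pass only records whose top-level logger name is in an allow-list."""
    def __init__(self, allowed):
        super().__init__()
        self.allowed = set(allowed)

    def filter(self, record):
        return record.name.split('.')[0] in self.allowed

# Attach to the one shared handler; only configurator records pass through
handler = logging.StreamHandler()
handler.addFilter(ModuleFilter(['configurator']))

# Demonstrate the filter decision on two hand-built records
f = ModuleFilter(['configurator'])
keep = f.filter(logging.LogRecord('configurator', logging.DEBUG,
                                  __file__, 1, 'msg', None, None))
drop = f.filter(logging.LogRecord('rosApi', logging.DEBUG,
                                  __file__, 1, 'msg', None, None))
```

The -d/-D switches would then just swap the allow-list the filter is built with.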
