How do I make a Logger global so that I can use it in every module I make?
Something like this in moduleA:
import logging
import moduleB
log = logging.getLogger('')
result = moduleB.goFigure(5)
log.info('Answer was %s', result)
With this in moduleB:
def goFigure(integer):
    if not isinstance(integer, int):
        log.critical('not an integer')
    else:
        return integer + 1
Currently, I will get an error because moduleB does not know what log is. How do I get around that?
You could make your own logging "module" which instantiates the logger, then have all of your code import that instead. Think:
logger.py:
import logging
log = logging.getLogger('')
codeA.py:
from logger import log
log.info('whatever')
codeB.py:
from logger import log
log.warning('some other thing')
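Note that `logging.getLogger('')` returns the root logger, which has no handlers attached until something configures it. A fuller sketch of the shared `logger.py` module might also attach a handler and formatter; the logger name and format here are illustrative, not prescribed:

```python
# logger.py -- a fuller sketch of the shared-logger module (handler setup is illustrative)
import logging
import sys

log = logging.getLogger("myapp")   # a named logger; getLogger('') would return the root logger
log.setLevel(logging.INFO)

if not log.handlers:               # guard: re-importing this module must not stack handlers
    handler = logging.StreamHandler(sys.stderr)
    handler.setFormatter(logging.Formatter(
        "%(asctime)s %(name)s %(levelname)s %(message)s"))
    log.addHandler(handler)
```

Every file that does `from logger import log` then gets the same configured object, because `getLogger` caches loggers by name.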
Creating a global logger that can either
create a new log file or
return a logger for a global log file.
Create a module called myLogger.py to hold the log-creation code.
myLogger.py:
import logging
def myLog(name, fname='myGlobalLog.log'):
    '''Debug Log'''
    logger = logging.getLogger(name)
    logger.setLevel(logging.DEBUG)
    fhan = logging.FileHandler(fname)
    fhan.setLevel(logging.DEBUG)
    logger.addHandler(fhan)
    formatter = logging.Formatter('%(asctime)s - %(name)s - %(levelname)s - %(message)s')
    fhan.setFormatter(formatter)
    # uncomment the next line to disable this logger
    #logger.disabled = True
    return logger
Now, to create a new log file in a module, say A.py:
from myLogger import myLog
log = myLog(__name__, 'newLog.log')
log.debug("In new log file")
Thus you have to pass the file name along while getting the logger.
To use the global logger in A.py:
from myLogger import myLog
log = myLog(__name__)
log.debug("In myGlobalLog file")
There is no need to pass the file name in this case, since we use the global log file.
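One caveat with this pattern: calling `myLog` twice with the same logger name stacks a second `FileHandler` on the same logger, so every record is written twice. A hedged variant that guards against that (same code as above, plus an idempotency check):

```python
import logging

def myLog(name, fname='myGlobalLog.log'):
    """Return a logger writing to fname, without stacking duplicate handlers."""
    logger = logging.getLogger(name)
    logger.setLevel(logging.DEBUG)
    # getLogger returns the same cached object per name, so a second call
    # would otherwise attach a second FileHandler to it
    if not logger.handlers:
        fhan = logging.FileHandler(fname)
        fhan.setLevel(logging.DEBUG)
        fhan.setFormatter(logging.Formatter(
            '%(asctime)s - %(name)s - %(levelname)s - %(message)s'))
        logger.addHandler(fhan)
    return logger
```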
By default, a module only has access to built-in functions and built-in constants. For all other names (variables, functions, and so on) you have to use the import keyword.
Now for your concrete example, you can import the log variable of moduleA in moduleB like this:
from moduleA import log
The following would be equivalent because the logging-module returns the same instance of the logger that was returned to moduleA:
import logging
log = logging.getLogger('')
Another solution could be to use the logging module's root logger directly, like this:
logging.info("Hello")
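The "same instance" behaviour is easy to verify: `getLogger` caches loggers by name, so any module asking for the same name gets the identical object (the name `"shared"` below is just for illustration):

```python
import logging

a = logging.getLogger("shared")
b = logging.getLogger("shared")
print(a is b)   # True: getLogger caches loggers by name

# '' and no argument both refer to the root logger
print(logging.getLogger("") is logging.getLogger())   # True
```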
Related
I have two files- one is my script and another is a module I'm using.
In the module, I have this function-
def sumTwoInts(x, y):
    logger.debug('Lets sum two ints')
    return x + y
In my script, I want to instantiate the log file
import logging
import myModule
logging.basicConfig(filename = 'logs.log')
logger = logging.getLogger()
logger.setLevel(logging.DEBUG)
myModule.sumTwoInts(1, 3)
How do I get the logger.debug code to add to the logs.log file created in my script?
Assuming you add
import logging
logger = logging.getLogger(__name__)
to myModule.py and run your main program, logs.log will contain
DEBUG:myModule:Lets sum two ints
Isn't that what you want to achieve?
Your code is already working if the logger is defined in the module file.
Note that I removed the file option from the basicConfig to check the result on the console. If you set it back, the log result will be added to the file.
# myModule.py
import logging
logger = logging.getLogger(__name__)
def sumTwoInts(x, y):
    logger.debug('Lets sum two ints')
    return x + y
# main.py
import logging
import myModule
logging.basicConfig()
logger = logging.getLogger()
logger.setLevel(logging.DEBUG)
myModule.sumTwoInts(1, 3)
python main.py
# DEBUG:myModule:Lets sum two ints
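The mechanism can also be demonstrated in a single file: a logger obtained with `getLogger(<module name>)` has no handlers of its own, so its records propagate up to the root logger configured by `basicConfig`. This sketch stands in for the two-file setup above (file name and message from the question; `force=` is a Python 3.8+ parameter used here to reset any prior root configuration):

```python
# single-file demonstration of record propagation (file name from the question)
import logging

logging.basicConfig(filename='logs.log', level=logging.DEBUG,
                    format='%(levelname)s:%(name)s:%(message)s',
                    force=True)   # force= (Python 3.8+) replaces any existing root handlers

# stands in for the logger that myModule creates with getLogger(__name__)
module_logger = logging.getLogger('myModule')

def sumTwoInts(x, y):
    module_logger.debug('Lets sum two ints')
    return x + y

sumTwoInts(1, 3)   # the record propagates to the root handler writing logs.log
```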
I have a logging function with a hardcoded logfile name (LOG_FILE):
setup_logger.py
import logging
import sys
FORMATTER = logging.Formatter("%(levelname)s - %(asctime)s - %(name)s - %(message)s")
LOG_FILE = "my_app.log"
def get_console_handler():
    console_handler = logging.StreamHandler(sys.stdout)
    console_handler.setFormatter(FORMATTER)
    return console_handler

def get_file_handler():
    file_handler = logging.FileHandler(LOG_FILE)
    file_handler.setFormatter(FORMATTER)
    return file_handler

def get_logger(logger_name):
    logger = logging.getLogger(logger_name)
    logger.setLevel(logging.DEBUG)  # better to have too much log than not enough
    logger.addHandler(get_console_handler())
    logger.addHandler(get_file_handler())
    # with this pattern, it's rarely necessary to propagate the error up to parent
    logger.propagate = False
    return logger
I use this in various modules this way:
main.py
from _Core import setup_logger as log
def main(incoming_feed_id: int, type: str) -> None:
    logger = log.get_logger(__name__)
    ...rest of my code
database.py
from _Core import setup_logger as log
logger = log.get_logger(__name__)
class Database:
    ...rest of my code
etl.py
import _Core.database as db
from _Core import setup_logger as log
logger = log.get_logger(__name__)
class ETL:
    ...rest of my code
What I want to achieve is to always change the logfile's path and name on each run based on arguments passed to the main() function in main.py.
Simplified example:
If main() receives the following arguments: incoming_feed_id = 1, type = simple_load, the logfile's name should be 1simple_load.log.
I am not sure what the best practice is for this. What I came up with is probably the worst option: add a log_file parameter to the get_logger() function in setup_logger.py, so I can set a filename from main() in main.py. But then I would need to pass the parameters from main to the other modules as well, which I do not think I should do, since for example the Database class is not even used in main.py.
I don't know enough about your application to be sure this will work for you, but you can simply configure the root logger in main() by calling get_logger('', filename_based_on_cmdline_args); anything logged by the other loggers will be passed to the root logger's handlers for processing, provided the configured logger levels allow it. The way you're doing it now opens multiple handlers pointing at the same file, which is sub-optimal. The other modules can then use plain logging.getLogger(__name__) rather than log.get_logger(__name__).
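A hedged sketch of that suggestion, collapsed into one file so it can be run directly (the filename scheme is from the question; `type` is renamed `load_type` here to avoid shadowing the builtin, and the helper name `configure_root_logger` is invented for illustration):

```python
# sketch: configure the root logger once in main(), per-run file name from the arguments
import logging

def configure_root_logger(incoming_feed_id, load_type):
    log_file = f"{incoming_feed_id}{load_type}.log"   # e.g. "1simple_load.log"
    handler = logging.FileHandler(log_file)
    handler.setFormatter(logging.Formatter(
        "%(levelname)s - %(asctime)s - %(name)s - %(message)s"))
    root = logging.getLogger()
    root.setLevel(logging.DEBUG)
    root.addHandler(handler)

def main(incoming_feed_id: int, load_type: str) -> None:
    configure_root_logger(incoming_feed_id, load_type)
    # database.py / etl.py would just do: logger = logging.getLogger(__name__)
    logging.getLogger("database").debug("propagates to the root handler")

main(1, "simple_load")
```

Because the other modules' loggers keep the default `propagate = True` and have no handlers of their own, their records end up in whichever file the root logger was configured with for this run.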
I have multiple python modules that I'd like to use the same logger while preserving the call hierarchy in those logs. I'd also like to do this with a logger whose name is the name of the calling module (or calling module stack). I haven't been able to work out how to get the name of the calling module except with messing with the stack trace, but that doesn't feel very pythonic.
Is this possible?
main.py
import logging
from sub_module import sub_log
logger = logging.getLogger(__name__)
logger.info("main_module")
sub_log()
sub_module.py
import logging
def sub_log():
    logger = logging.getLogger(???)
    logger.info("sub_module")
Desired Output
TIME main INFO main_module
TIME main.sub_module INFO sub_module
To solve your problem in a pythonic way, use the Logger Formatter. For reference, check the
Logging Docs
main.py
import logging
from submodule import sub_log
from submodule2 import sub_log2
logger = logging.getLogger(__name__)
logger.setLevel(logging.DEBUG)
# create file handler which logs even debug messages
fh = logging.FileHandler('test.log')
fh.setLevel(logging.DEBUG)
# create formatter and add it to the handlers
formatter = logging.Formatter('%(asctime)s %(name)s.%(module)s - %(levelname)s - %(message)s')
fh.setFormatter(formatter)
# add the handlers to the logger
logger.addHandler(fh)
sub_log("test")
sub_log2("test")
submodule.py
import logging
import __main__
def sub_log(msg):
    logger = logging.getLogger(__main__.__name__)
    logger.info(msg)
I've created a second submodule (same code, different name).
My Results:
2018-10-16 20:41:23,860 __main__.submodule - INFO - test
2018-10-16 20:41:23,860 __main__.submodule2 - INFO - test
I hope this will help you :)
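An alternative sketch that leans on the logging hierarchy instead of importing `__main__`: give the sub-module's logger a dotted name under the main logger, and attach the handler only to the parent. The names here are illustrative, and the sub-module still has to know (or be passed) its parent's logger name, so this shows the mechanism rather than a drop-in fix:

```python
import logging
import sys

# main.py side: configure a named parent logger once
parent = logging.getLogger("main")
parent.setLevel(logging.INFO)
handler = logging.StreamHandler(sys.stdout)
handler.setFormatter(logging.Formatter("%(name)s %(levelname)s %(message)s"))
parent.addHandler(handler)

# sub_module.py side: the dotted name makes this a child of "main"
child = logging.getLogger("main.sub_module")

parent.info("main_module")   # -> main INFO main_module
child.info("sub_module")     # -> main.sub_module INFO sub_module
```

The child has no handler of its own; its records propagate up and are emitted by the parent's handler, which produces exactly the two-level names in the desired output.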
My goal is to redirect logging messages from a certain function into a file. This function is defined in another module. I added a StreamHandler to the main logger, but a message from the child_call function is not saved to tmp.log as expected.
# main.py
import logging
import os
import sys
from child_logger import child_call
logging.basicConfig(format='%(name)s:%(filename)s:%(lineno)d:%(message)s',
level=logging.INFO, stream=sys.stdout)
logger = logging.getLogger(__name__)
logger.info('here it is')
with open('tmp.log', 'w') as f:
    logger_stream_handler = logging.StreamHandler(stream=f)
    logger_stream_handler.setLevel(logging.INFO)
    logger.addHandler(logger_stream_handler)

    logger.info('I am outer')
    child_call()

    logger.removeHandler(logger_stream_handler)
# child_logger.py
import logging
logger = logging.getLogger(__name__)
def child_call():
    logger.info('I am inner')
Here is the output:
%python logger_test.py
__main__:logger_test.py:18:here it is
__main__:logger_test.py:25:I am outer
child_logger:child_logger.py:9:I am inner
%cat tmp.log
I am outer
I am expecting to see 'I am inner' in tmp.log. As far as I understood the logging module, a hierarchy of Logger objects is created, messages from children should propagate to the root Logger by default, and the root should handle all messages. What am I missing?
The problem is that your loggers are not correctly chained. They need to have the same root name. For example:
# main.py
logger = logging.getLogger("parent")
# child_logger.py
logger = logging.getLogger("parent.child")
Both of your log retrievals just ask for a logger with __name__, which is set to the name of the module, except for the top level, which gets "__main__". You are ending up with the equivalent of this:
# main.py
logger = logging.getLogger("__main__")
# child_logger.py
logger = logging.getLogger("child_logger")
You need to enforce a common parent logging name scheme to create the correct logger inheritance hierarchy.
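A minimal sketch of the chained-name fix, using an in-memory stream in place of the file so it is self-contained (logger names are the ones from this answer):

```python
import logging
import io

stream = io.StringIO()               # stands in for the tmp.log file handle
handler = logging.StreamHandler(stream)

parent = logging.getLogger("parent")
parent.setLevel(logging.INFO)
parent.addHandler(handler)

child = logging.getLogger("parent.child")   # the dotted name chains it under "parent"
child.info("I am inner")                    # no handler of its own: propagates to parent

print(stream.getvalue())   # -> I am inner
```

With the original flat names (`__main__` vs `child_logger`), the child's records propagated only to the root logger, which never had the file handler attached; that is why tmp.log stayed incomplete.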
I want to create a Python logging object in my main program and have logging in both my main program and in the modules it uses at the same logging level. The basic example given in logging documentation is essentially as follows:
main.py:
import logging
import mylib
def main():
    logging.basicConfig(level = logging.INFO)
    logging.info('Started')
    mylib.do_something()
    logging.info('Finished')

if __name__ == '__main__':
    main()
mylib.py:
import logging
def do_something():
    logging.info('Doing something')
This works fine. I am not sure, however, of how to get a Python logging object doing something similar. The following, for example, does not work:
main.py:
import logging
import mylib
def main():
    verbose = True
    global log
    log = logging.getLogger(__name__)
    if verbose:
        log.setLevel(logging.INFO)
    else:
        log.setLevel(logging.DEBUG)
    log.info('Started')
    mylib.do_something()
    log.info('Finished')

if __name__ == '__main__':
    main()
mylib.py:
import logging
def do_something():
    log.info('Doing something')
It does not work because the global log object is not recognised in the mylib.py module. How should I be doing this? My two main goals are
to have the logging level that is set in my main program propagate through to any modules used and
to be able to use log for logging, not "logging" (i.e. log.info("alert") as opposed to logging.info("alert")).
Your application should configure the logging once (with basicConfig or see logging.config) and then in each file you can have:
import logging
log = logging.getLogger(__name__)
# further down:
log.info("alert")
So you use a logger everywhere and you don't directly access the logging module in your codebase. This allows you to configure all your loggers in your logging configuration (which, as said earlier, is configured once and loaded at the beginning of your application).
You can use a different logger in each module as follows:
import logging
LOG = logging.getLogger(__name__)
# Stuff
LOG.info("A log message")