I have a custom logging class, which has the following format:
log_format = "%(asctime)s.%(msecs)d %(levelname)-8s [%(processName)s] [%(threadName)s] %(filename)s:%(lineno)d --- %(message)s"
My project tree looks something like this:
.
├── exceptions.py
├── logger.py
├── settings.py
└── main.py
In main.py I import my custom Logger from logger.py. In several places I perform logging using the following syntax:
Logger.info("Transcribed audio successfully.")
However when looking at the logs, the filename and lineno params are always referring to my Logger class, not the actual function from main.py which invoked the logging:
2023-02-15 10:48:06,241.241 INFO [MainProcess] [MainThread] logger.py:38 --- Transcribed audio successfully.
Is there a way to change this? I would like that the log entry states something like:
2023-02-15 10:48:06,241.241 INFO [MainProcess] [MainThread] main.py:98 --- Transcribed audio successfully.
This is my logger.py file:
import logging
from logging.handlers import RotatingFileHandler
class Logger:
    log_format = "%(asctime)s.%(msecs)d %(levelname)-8s [%(processName)s] [%(threadName)s] %(filename)s:%(lineno)d --- %(message)s"

    @staticmethod
    def setup_single_logger(name, logfile, level):
        handler = RotatingFileHandler(logfile, mode='a', maxBytes=1024 * 1024, backupCount=10)
        handler.setFormatter(logging.Formatter(Logger.log_format))
        logger = logging.getLogger(name)
        logger.setLevel(level)
        logger.addHandler(handler)
        return logger

    @staticmethod
    def setup_logging():
        Logger.info_logger = Logger.setup_single_logger('INFO', '/path/to/your/logfile.log', logging.INFO)

    @staticmethod
    def info(msg, *args, **kwargs):
        Logger.info_logger.info(msg, *args, **kwargs)

Logger.setup_logging()
And an example main.py is:
from logger import Logger
Logger.info("Transcribed audio successfully.")
The problem is that you set up and create your logger in logger.py. So when you call Logger.info from main, the actual call to info on the logger object happens inside logger.py. You have merely created an interface to the logger object, and the underlying call is still the one that determines the attributes of the message.
What you could do is leave only the setup in logger.py, and let main.py have its own logger object.
So log.py (changed name due to clashes) can be:
import logging
from logging.handlers import RotatingFileHandler
def basic_config():
    log_format = "%(asctime)s.%(msecs)d %(levelname)-8s [%(processName)s] [%(threadName)s] %(filename)s:%(lineno)d --- %(message)s"
    handler = RotatingFileHandler('logfile.log', mode='a', maxBytes=1024 * 1024, backupCount=10)
    handler.setFormatter(logging.Formatter(log_format))
    logging.basicConfig(handlers=(handler,), level=logging.INFO)
(I felt like the use of the class was not necessary)
And main.py:
import log
import logging
log.basic_config()
logger = logging.getLogger(__file__)
logger.info("test")
Now the log file I got from running once with your code and once with mine was:
2023-02-15 13:14:30,275.275 INFO [MainProcess] [MainThread] log.py:23 --- test
2023-02-15 13:18:51,358.358 INFO [MainProcess] [MainThread] main.py:7 --- test
I have a logging function with hardcoded logfile name (LOG_FILE):
setup_logger.py
import logging
import sys
FORMATTER = logging.Formatter("%(levelname)s - %(asctime)s - %(name)s - %(message)s")
LOG_FILE = "my_app.log"
def get_console_handler():
    console_handler = logging.StreamHandler(sys.stdout)
    console_handler.setFormatter(FORMATTER)
    return console_handler

def get_file_handler():
    file_handler = logging.FileHandler(LOG_FILE)
    file_handler.setFormatter(FORMATTER)
    return file_handler

def get_logger(logger_name):
    logger = logging.getLogger(logger_name)
    logger.setLevel(logging.DEBUG)  # better to have too much log than not enough
    logger.addHandler(get_console_handler())
    logger.addHandler(get_file_handler())
    # with this pattern, it's rarely necessary to propagate the error up to parent
    logger.propagate = False
    return logger
I use this in various modules this way:
main.py
from _Core import setup_logger as log
def main(incoming_feed_id: int, type: str) -> None:
    logger = log.get_logger(__name__)
    ...rest of my code

database.py

from _Core import setup_logger as log

logger = log.get_logger(__name__)

class Database:
    ...rest of my code

etl.py

import _Core.database as db
from _Core import setup_logger as log

logger = log.get_logger(__name__)

class ETL:
    ...rest of my code
What I want to achieve is to always change the logfile's path and name on each run based on arguments passed to the main() function in main.py.
Simplified example:
If main() receives the following arguments: incoming_feed_id = 1, type = simple_load, the logfile's name should be 1simple_load.log.
I am not sure what the best practice for this is. What I came up with is probably the worst thing to do: add a log_file parameter to the get_logger() function in setup_logger.py, so I can set a filename in main() in main.py. But in that case I would need to pass the parameter on to the other modules as well, which I don't think I should do, since for example the Database class is not even used in main.py.
I don't know enough about your application to be sure this will work for you, but you can simply configure the root logger in main() by calling get_logger('', filename_based_on_cmdline_args); anything logged to the other loggers will then be passed to the root logger's handlers for processing, provided the configured logger levels allow it. The way you're doing it now opens multiple handlers pointing at the same file, which is sub-optimal. The other modules can just use logging.getLogger(__name__) rather than log.get_logger(__name__).
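A sketch of that approach, assuming get_logger() keeps its name but gains a log_file parameter (both the signature change and the filename scheme below are illustrative):

```python
import logging

def get_logger(logger_name, log_file):
    # an empty name returns the root logger, so every module-level logger
    # created with logging.getLogger(__name__) propagates its records here
    logger = logging.getLogger(logger_name)
    logger.setLevel(logging.DEBUG)
    handler = logging.FileHandler(log_file)
    handler.setFormatter(
        logging.Formatter("%(levelname)s - %(asctime)s - %(name)s - %(message)s")
    )
    logger.addHandler(handler)
    return logger

def main(incoming_feed_id: int, type: str) -> None:
    # build the per-run logfile name from the arguments, e.g. "1simple_load.log"
    get_logger("", f"{incoming_feed_id}{type}.log")
    # any module's logging.getLogger(__name__) now writes to that file
    logging.getLogger(__name__).info("run started")

main(1, "simple_load")
```

Only main.py needs to know the filename; database.py and etl.py never see it, because their records reach the root logger's handler through propagation.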
I want to create 2 loggers which log to 2 different outputs in Python. This logger of mine is configured in a single module which is used by 2 other main modules. My problem: since I configured the root logger so that my logger can be used by the 2 different main modules, I cannot separate the logging output.
How can this be done?
Here is how I configure my logging:
# logger.py
import logging
def setup_logging():
    # If I give a name to getLogger, it will not configure the root logger,
    # and my changes here cannot cascade to all other child loggers.
    logger = logging.getLogger()
    streamHandler = logging.StreamHandler()
    streamFormat = logging.Formatter('%(name)s - %(message)s')
    streamHandler.setFormatter(streamFormat)
    logger.addHandler(streamHandler)
# main1.py
import logging
from logger import setup_logging
from submodule import log_me
setup_logging()
logger = logging.getLogger('main1')
logger.info('I am from main1')
log_me()
# main2.py
import logging
from logger import setup_logging
from submodule import log_me
setup_logging()
logger = logging.getLogger('main2')
logger.info('I am from main2')
log_me()
# submodule.py
import logging
logger = logging.getLogger('submodule')
def log_me():
    logger.info('I am from submodule')
Result from main1:
main1 - I am from main1
submodule - I am from submodule
Result from main2:
main2 - I am from main2
submodule - I am from submodule
Here is what I am trying to achieve (but it beautifully fails, of course).
# logger.py
import logging
def setup_logging():
    logger = logging.getLogger()
    streamHandler = logging.StreamHandler()
    streamFormat = logging.Formatter('%(name)s - %(message)s')
    streamHandler.setFormatter(streamFormat)
    logger.addHandler(streamHandler)

def setup_second_logging():
    logger2 = logging.getLogger()
    fileHandler = logging.FileHandler('./out.log')
    fileFormat = logging.Formatter('%(name)s - %(message)s')
    fileHandler.setFormatter(fileFormat)
    logger2.addHandler(fileHandler)
--same main1.py--
--same main2.py--
# submodule.py
import logging
from logger import setup_second_logging
setup_second_logging()
logger = logging.getLogger('submodule')
def log_me():
    logger.info('I am from submodule')
Result from main1:
main1 - I am from main1
# no submodule since it is logged to file
Result from main2:
main2 - I am from main2
# no submodule since it is logged to file
You can try to use hierarchy for the loggers. For example:
logger = logging.getLogger(__name__)
And in this case you can set specific handlers for the submodules in logger.py.
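A sketch of what that could look like: the root logger keeps the console handler for everything, while a file handler is attached only to the 'submodule' logger (the file name out.log is illustrative):

```python
import logging

def setup_logging():
    # console output for all loggers, configured once on the root logger
    root = logging.getLogger()
    root.setLevel(logging.INFO)
    stream = logging.StreamHandler()
    stream.setFormatter(logging.Formatter('%(name)s - %(message)s'))
    root.addHandler(stream)

    # file output only for the 'submodule' logger and its children
    sub = logging.getLogger('submodule')
    file_handler = logging.FileHandler('out.log')
    file_handler.setFormatter(logging.Formatter('%(name)s - %(message)s'))
    sub.addHandler(file_handler)
    # stop submodule records from also reaching the root's console handler
    sub.propagate = False

setup_logging()
logging.getLogger('main1').info('I am from main1')          # console only
logging.getLogger('submodule').info('I am from submodule')  # file only
```

This keeps the single setup function while routing the two outputs through the logger hierarchy instead of through two root configurations.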
I have multiple python modules that I'd like to use the same logger while preserving the call hierarchy in those logs. I'd also like to do this with a logger whose name is the name of the calling module (or the calling module stack). I haven't been able to work out how to get the name of the calling module except by inspecting the stack trace, but that doesn't feel very pythonic.
Is this possible?
main.py
import logging
from sub_module import sub_log
logger = logging.getLogger(__name__)
logger.info("main_module")
sub_log()
sub_module.py
import logging
def sub_log():
    logger = logging.getLogger(???)
    logger.info("sub_module")
Desired Output
TIME main INFO main_module
TIME main.sub_module INFO sub_module
To solve your problem pythonically, use the logging Formatter:
For reference check the
Logging Docs
main.py
import logging
from submodule import sub_log
from submodule2 import sub_log2
logger = logging.getLogger(__name__)
logger.setLevel(logging.DEBUG)
# create file handler which logs even debug messages
fh = logging.FileHandler('test.log')
fh.setLevel(logging.DEBUG)
# create formatter and add it to the handlers
formatter = logging.Formatter('%(asctime)s %(name)s.%(module)s - %(levelname)s - %(message)s')
fh.setFormatter(formatter)
# add the handlers to the logger
logger.addHandler(fh)
sub_log("test")
sub_log2("test")
submodule.py
import logging
import __main__
def sub_log(msg):
    logger = logging.getLogger(__main__.__name__)
    logger.info(msg)
I've created a second submodule (same code, different name).
My Results:
2018-10-16 20:41:23,860 __main__.submodule - INFO - test
2018-10-16 20:41:23,860 __main__.submodule2 - INFO - test
I hope this will help you :)
I was wondering how to implement a global logger that could be used everywhere with your own settings:
I currently have a custom logger class:
class customLogger(logging.Logger):
    ...
The class is in a separate file with some formatters and other stuff.
The logger works perfectly on its own.
I import this module in my main python file and create an object like this:
self.log = logModule.customLogger(arguments)
But obviously, I cannot access this object from other parts of my code.
Am i using a wrong approach? Is there a better way to do this?
Use logging.getLogger(name) to create a named global logger.
main.py
import log
logger = log.setup_custom_logger('root')
logger.debug('main message')
import submodule
log.py
import logging
def setup_custom_logger(name):
    formatter = logging.Formatter(fmt='%(asctime)s - %(levelname)s - %(module)s - %(message)s')
    handler = logging.StreamHandler()
    handler.setFormatter(formatter)
    logger = logging.getLogger(name)
    logger.setLevel(logging.DEBUG)
    logger.addHandler(handler)
    return logger
submodule.py
import logging
logger = logging.getLogger('root')
logger.debug('submodule message')
Output
2011-10-01 20:08:40,049 - DEBUG - main - main message
2011-10-01 20:08:40,050 - DEBUG - submodule - submodule message
Since I haven't found a satisfactory answer, I would like to elaborate on the answer to the question a little bit in order to give some insight into the workings and intents of the logging library, that comes with Python's standard library.
In contrast to the approach of the OP (original poster), the library clearly separates the interface to the logger from the configuration of the logger itself.
The configuration of handlers is the prerogative of the application developer who uses your library.
That means you should not create a custom logger class and configure the logger inside that class by adding any configuration or whatsoever.
The logging library introduces four components: loggers, handlers, filters, and formatters.
Loggers expose the interface that application code directly uses.
Handlers send the log records (created by loggers) to the appropriate destination.
Filters provide a finer grained facility for determining which log records to output.
Formatters specify the layout of log records in the final output.
A common project structure looks like this:
Project/
|-- .../
| |-- ...
|
|-- project/
| |-- package/
| | |-- __init__.py
| | |-- module.py
| |
| |-- __init__.py
| |-- project.py
|
|-- ...
|-- ...
Inside your code (like in module.py) you refer to the logger instance of your module to log the events at their specific levels.
A good convention to use when naming loggers is to use a module-level logger, in each module which uses logging, named as follows:
logger = logging.getLogger(__name__)
The special variable __name__ refers to your module's name and looks something like project.package.module depending on your application's code structure.
module.py (and any other class) could essentially look like this:
import logging
...

log = logging.getLogger(__name__)

class ModuleClass:
    def do_something(self):
        log.debug('do_something() has been called!')
The logger in each module will propagate any event to the parent logger, which in turn passes the information to its attached handler! Analogously to the python package/module structure, the parent logger is determined by the namespace using "dotted module names". That's why it makes sense to initialize the logger with the special __name__ variable (in the example above, __name__ matches the string "project.package.module").
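Propagation can be demonstrated in a few lines: only the parent logger has a handler, yet a record logged on the child still reaches it (the "project" names below mirror the example structure):

```python
import logging

# parent logger with the only handler in the hierarchy
parent = logging.getLogger("project")
parent.setLevel(logging.DEBUG)
handler = logging.StreamHandler()
handler.setFormatter(logging.Formatter("%(name)s: %(message)s"))
parent.addHandler(handler)

# no handler attached here -- the record propagates up to "project"
child = logging.getLogger("project.package.module")
child.debug("handled by the parent's handler")
```

The record keeps its original logger name ("project.package.module") even though the parent's handler emits it, which is what preserves the call hierarchy in the output.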
There are two options to configure the logger globally:
Instantiate a logger in project.py with the name __package__ which equals "project" in this example and is therefore the parent logger of the loggers of all submodules. It is only necessary to add an appropriate handler and formatter to this logger.
Set up a logger with a handler and formatter in the executing script (like main.py) with the name of the topmost package.
When developing a library which uses logging, you should take care to document how the library uses logging - for example, the names of loggers used.
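Part of that documented convention is that a library attaches only a NullHandler and leaves all real configuration to the application:

```python
import logging

# at the top of the library's package __init__.py
logger = logging.getLogger(__name__)
# a NullHandler silently discards records, avoiding "No handlers could
# be found" warnings when the application has not configured logging;
# the application attaches the real handlers to the root or parent logger
logger.addHandler(logging.NullHandler())
```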
The executing script, like main.py for example, might finally look something like this:
import logging
from project import App

def setup_logger():
    # create logger
    logger = logging.getLogger('project')
    logger.setLevel(logging.DEBUG)
    # create console handler and set level to debug
    ch = logging.StreamHandler()
    ch.setLevel(logging.DEBUG)
    # create formatter
    formatter = logging.Formatter('%(asctime)s [%(levelname)s] %(name)s: %(message)s')
    # add formatter to ch
    ch.setFormatter(formatter)
    # add ch to logger
    logger.addHandler(ch)

if __name__ == '__main__' and __package__ is None:
    setup_logger()
    app = App()
    app.do_some_funny_stuff()
The method call log.setLevel(...) specifies the lowest-severity log message a logger will handle but not necessarily output! It simply means the message is passed to the handler as long as the message's severity level is higher than (or equal to) the one that is set. But the handler is responsible for handling the log message (by printing or storing it for example).
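The interplay between the two levels can be seen directly: the logger below lets DEBUG records through to its handler, but the handler is set to WARNING and drops them (the logger name and messages are illustrative):

```python
import io
import logging

stream = io.StringIO()
logger = logging.getLogger("levels-demo")
logger.setLevel(logging.DEBUG)        # logger passes everything to its handlers

handler = logging.StreamHandler(stream)
handler.setLevel(logging.WARNING)     # handler only emits WARNING and above
handler.setFormatter(logging.Formatter("%(levelname)s %(message)s"))
logger.addHandler(handler)

logger.debug("reaches the handler, but the handler discards it")
logger.warning("actually written")
print(stream.getvalue())
```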
Hence the logging library offers a structured and modular approach which just needs to be exploited according to one's needs.
Logging documentation
Create an instance of customLogger in your log module and use it as a singleton - just use the imported instance, rather than the class.
The python logging module is already good enough as a global logger; you might simply be looking for this:
main.py
import logging
logging.basicConfig(level = logging.DEBUG,format = '[%(asctime)s] %(levelname)s [%(name)s:%(lineno)s] %(message)s')
Put the codes above into your executing script, then you can use this logger with the same configs anywhere in your projects:
module.py
import logging
logger = logging.getLogger(__name__)
logger.info('hello world!')
For more complicated configs you may use a config file, logging.conf, loaded via logging.config:

import logging.config

logging.config.fileConfig("logging.conf")
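A minimal logging.conf for that call might look like this (the handler, formatter, and format string here are illustrative):

```ini
[loggers]
keys=root

[handlers]
keys=console

[formatters]
keys=simple

[logger_root]
level=DEBUG
handlers=console

[handler_console]
class=StreamHandler
level=DEBUG
formatter=simple
args=(sys.stdout,)

[formatter_simple]
format=[%(asctime)s] %(levelname)s [%(name)s:%(lineno)s] %(message)s
```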
You can just pass getLogger a string with a common sub-string before the first period. The parts of the string separated by periods (".") can be used for different classes / modules / files / etc., and loggers sharing a prefix share a parent. Like so (see specifically the logger = logging.getLogger(loggerName) part):
import logging
import os
import sys
from logging.handlers import RotatingFileHandler

def getLogger(name, logdir=LOGDIR_DEFAULT, level=logging.DEBUG, logformat=FORMAT):
    base = os.path.basename(__file__)
    loggerName = "%s.%s" % (base, name)
    logFileName = os.path.join(logdir, "%s.log" % loggerName)
    logger = logging.getLogger(loggerName)
    logger.setLevel(level)
    i = 0
    while os.path.exists(logFileName) and not os.access(logFileName, os.R_OK | os.W_OK):
        i += 1
        logFileName = "%s.%s.log" % (logFileName.replace(".log", ""), str(i).zfill(len(str(i)) + 1))
    try:
        # fh = logging.FileHandler(logFileName)
        fh = RotatingFileHandler(filename=logFileName, mode="a", maxBytes=1310720, backupCount=50)
    except IOError as exc:
        errOut = "Unable to create/open log file \"%s\"." % logFileName
        if exc.errno == 13:  # Permission denied
            errOut = "ERROR ** Permission Denied ** - %s" % errOut
        elif exc.errno == 2:  # No such directory
            errOut = "ERROR ** No such directory \"%s\" ** - %s" % (os.path.split(logFileName)[0], errOut)
        elif exc.errno == 24:  # Too many open files
            errOut = "ERROR ** Too many open files ** - Check open file descriptors in /proc/<PID>/fd/ (PID: %s)" % os.getpid()
        else:
            errOut = "Unhandled Exception ** %s ** - %s" % (str(exc), errOut)
        raise LogException(errOut)
    else:
        formatter = logging.Formatter(logformat)
        fh.setLevel(level)
        fh.setFormatter(formatter)
        logger.addHandler(fh)
        return logger
class MainThread:
    def __init__(self, cfgdefaults, configdir, pidfile, logdir, test=False):
        self.logdir = logdir
        self.test = test
        logLevel = logging.DEBUG
        logPrefix = "MainThread_TEST" if self.test else "MainThread"
        try:
            self.logger = getLogger(logPrefix, self.logdir, logLevel, FORMAT)
        except LogException as exc:
            sys.stderr.write("%s\n" % exc)
            sys.stderr.flush()
            os._exit(0)
        else:
            self.logger.debug("-------------------- MainThread created. Starting __init__() --------------------")

    def run(self):
        self.logger.debug("Initializing ReportThreads..")
        for (group, cfg) in self.config.items():
            self.logger.debug(" ------------------------------ GROUP '%s' CONFIG ------------------------------ " % group)
            for k2, v2 in cfg.items():
                self.logger.debug("%s <==> %s: %s" % (group, k2, v2))
            try:
                rt = ReportThread(self, group, cfg, self.logdir, self.test)
            except LogException as exc:
                sys.stderr.write("%s\n" % exc)
                sys.stderr.flush()
                self.logger.exception("Exception when creating ReportThread (%s)" % group)
                logging.shutdown()
                os._exit(1)
            else:
                self.threads.append(rt)
        self.logger.debug("Threads initialized.. \"%s\"" % ", ".join([t.name for t in self.threads]))
        for t in self.threads:
            t.Start()
        if not self.test:
            self.loop()
class ReportThread:
    def __init__(self, mainThread, name, config, logdir, test):
        self.mainThread = mainThread
        self.name = name
        self.test = test
        logLevel = logging.DEBUG
        self.logger = getLogger("MainThread%s.ReportThread_%s" % ("_TEST" if self.test else "", self.name), logdir, logLevel, FORMAT)
        self.logger.info("init database...")
        self.initDB()
        # etc....
if __name__ == "__main__":
    # .....
    MainThread(cfgdefaults=options.cfgdefaults, configdir=options.configdir, pidfile=options.pidfile, logdir=options.logdir, test=options.test)