I created a custom logger in Python and made it a utility. It is context-based and adds custom handlers for different scenarios. I am trying to make this custom logger visible across all modules, but I am not able to. I don't want to repeat the following line in each of my modules, and pass my config and context around, just for the logger.
logger = myLogger(config, context) # config has data for context based custom handling
In my main module, I simply made the logger object global so that other functions can use it without any further setup. Is there a way to do the same across modules?
In many similar questions, what is suggested is
logger = logging.getLogger(__name__)
But this does not carry over my custom handlers.
Can someone please advise how I can achieve this:
make my custom logger global for the whole runtime so that I don't have to create it every time I need it.
My code looks like this:
import argparse
import configparser

def main():
    parser = argparse.ArgumentParser()
    parser.add_argument('-context', '--context')
    parser.add_argument('-cfg', '--cfg')
    args = parser.parse_args()

    config = configparser.ConfigParser()
    config.read(args.cfg)

    global logger
    logger = myLogger(config, args.context)
    # context is the section name in the config that holds details for the current process.
    # myLogger reads the log file configuration details from the configparser object,
    # removes the default handlers, adds my custom handlers, and returns the logger object.
    # Making logger global in main makes it visible to other functions in the same module,
    # but I am trying to make it visible to other modules as well when I call their functions.

if __name__ == '__main__':
    main()
My way of doing it:
Logging.py
import logging
# Your custom stuff
logger = myLogger(config, context)
Every_other_file.py
from Logging import logger
Edit: Change the config later on
Logging.py
import logging
# Your custom stuff
logger = myLogger(config, context)
def change_config(config):
    global logger
    logger = myLogger(config, context)

def set_logger(config):
    global logger
    logger = myLogger(config, context)
main.py

import Logging

config = something
Logging.set_logger(config)
Something like that. My point is that by calling a method on the Logging module you can change the logger it holds.
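One caveat worth noting (my addition, not part of the original answer): from Logging import logger binds the name at import time, so a later set_logger() call rebinds Logging.logger but not the copies other modules already imported. Looking the logger up through the module avoids that; a minimal sketch:

# every_other_file.py
import Logging  # import the module rather than the logger name

def do_work():
    # attribute lookup happens at call time, so this always sees the current logger
    Logging.logger.info("doing work")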
We are using Alembic to apply DB revisions. I have configured the connection and it works as expected, except that I am not able to make it use our custom logger.
We have our own logger class (derived from Python logging) which is used throughout the application, and I want alembic to use it instead of the default.
Is there any way I can pass a logger object of our class to it? I want it to print its own log using the format and handler that is defined in the custom logger.
I tried the following.
env.py:
from sqlalchemy import engine_from_config
from sqlalchemy import pool
from alembic import context
from logging.config import fileConfig
from tools.logger import Logger

# this is the Alembic Config object, which provides
# access to the values within the .ini file in use.
config = context.config

# Interpret the config file for Python logging.
# This line sets up loggers basically.
if config.attributes.get('configure_logger', True):
    fileConfig(config.config_file_name)

logger = Logger('alembic.env')
In my script:
self.alembic_cfg = alembic_config(self.alembic_ini_path, attributes={'configure_logger': False})
I also tried,
self.alembic_cfg.set_section_option("logger", "keys", "root")
Both of the above approaches just disable Alembic's own logging.
To my knowledge, it is not possible to replace one logger with another. Is it something that you really need though?
I want it to print its own log using the format and handler that is defined in the custom logger.
As I understand it, a logger has handlers and handlers have formatters. If you have a handler with a formatter, you could just edit alembic.ini and assign your handler to the alembic logger.
add formatter to formatters in ini file
[formatters] # existing section
keys = generic,pyraider # just add the name of your formatter
define your custom formatter
[formatter_pyraider]
class=tools.logger.PyraiderFormatter
add handler to handlers in ini file
[handlers] # existing section
keys = console,pyraider # just add the name of your handler
define your custom handler
[handler_pyraider] # new section, handler_<your_name>
class = tools.logger.PyraiderHandler
args = (sys.stderr,) # might need to play around with this one
level = INFO
formatter = pyraider
assign handler to alembic logger
[logger_alembic] # existing section, what you want
level = INFO
handlers = pyraider # <---- add your handler, defined previously, here
qualname = alembic
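For reference, the classes those ini entries point at are just ordinary logging subclasses. A minimal sketch of what a hypothetical tools/logger.py could contain (the Pyraider names come from the ini example above, not from the question):

# tools/logger.py (hypothetical sketch)
import logging

class PyraiderFormatter(logging.Formatter):
    def format(self, record):
        # prepend an application tag; your real formatting logic would live here
        return "[pyraider] " + super().format(record)

class PyraiderHandler(logging.StreamHandler):
    # fileConfig instantiates this with the `args` tuple from the ini, e.g. (sys.stderr,)
    def emit(self, record):
        # delegate to StreamHandler; any custom behaviour would go here
        super().emit(record)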
Docs on alembic.ini file.
https://alembic.sqlalchemy.org/en/latest/tutorial.html#editing-the-ini-file
You might need to tweak some things, but it should work, as that's basically how the Python logging module works.
More info on how to structure your ini file for the logging module:
Official Python Docs
Hitchhiker's Guide
I have been trying to create a new Logger class by subclassing logging.Logger. The Python version is 3.5.
I have several modules in my application, and I configure logging only in the main module, where I set the logger class using logging.setLoggerClass(...).
However, when I retrieve the same logger from another module, I get a plain Logger instance and not an instance of the subclass I defined.
For example, my code is:
# module 1
import logging

class MyLoggerClass(logging.getLoggerClass()):
    def __init__(self, name):
        super(MyLoggerClass, self).__init__(name)

    def new_logger_method(self, *args, **kwargs):
        # some new functionality
        pass

if __name__ == "__main__":
    logging.setLoggerClass(MyLoggerClass)
    mylogger = logging.getLogger("mylogger")
    # configuration of mylogger instance
# module 2
import logging
applogger = logging.getLogger("mylogger")
print(type(applogger))
def some_function():
    applogger.debug("in module 2 some_function")
When this code is executed, I expect the applogger in module 2 to be of type MyLoggerClass. I intend to use the new_logger_method for some new functionality.
However, since applogger turns out to be of type logging.Logger, running the code raises AttributeError: 'Logger' object has no attribute 'new_logger_method'.
Has anyone ever faced this issue?
Thanks in advance for any help!
Pranav
Instead of attempting to affect the global logger by changing the default logger factory, if you want your module to play nicely with any environment you should define a logger just for your module (and its children) and use it as a main logger for everything else deeper in your module structure. The trouble is that you explicitly want to use a different logging.Logger class than the default/globally defined one and the logging module doesn't provide an easy way to do context-based factory switching so you'll have to do it yourself.
There are many ways to do that but my personal preference is to be as explicit as possible and define your own logger module which you'll then import in your other modules in your package whenever you need to obtain a custom logger. In your case, you can create logger.py at the root of your package and do something like:
import logging

class CustomLogger(logging.Logger):
    def __init__(self, name):
        super(CustomLogger, self).__init__(name)

    def new_logger_method(self, caller=None):
        self.info("new_logger_method() called from: {}.".format(caller))

def getLogger(name=None, custom_logger=True):
    if not custom_logger:
        return logging.getLogger(name)
    logging_class = logging.getLoggerClass()  # store the current logger factory for later
    logging._acquireLock()  # use the global logging lock for thread safety
    try:
        logging.setLoggerClass(CustomLogger)  # temporarily change the logger factory
        logger = logging.getLogger(name)
        logging.setLoggerClass(logging_class)  # be nice, revert the logger factory change
        return logger
    finally:
        logging._releaseLock()
Feel free to include other custom log initialization logic in it if you so desire. Then from your other modules (and sub-packages) you can import this logger and use its getLogger() to obtain a local, custom logger. For example, all you need in module1.py is:
from . import logger # or `from package import logger` for external/non-relative use
log = logger.getLogger(__name__) # obtain a main logger for this module
def test():  # lets define a function we can later call for testing
    log.new_logger_method("Module 1")
This covers the internal use - as long as you stick to this pattern in all your modules/sub-modules you'll have the access to your custom logger.
When it comes to external use, you can write an easy test to show you that your custom logger gets created and that it doesn't interfere with the rest of the logging system therefore your package/module can be declared a good citizen. Under the assumption that your module1.py is in a package called package and you want to test it as a whole from the outside:
import logging # NOTE: we're importing the global, standard `logging` module
import package.module1
logging.basicConfig() # initialize the most rudimentary root logger
root_logger = logging.getLogger() # obtain the root logger
root_logger.setLevel(logging.DEBUG) # set root log level to DEBUG
# lets see the difference in Logger types:
print(root_logger.__class__) # <class 'logging.RootLogger'>
print(package.module1.log.__class__) # <class 'package.logger.CustomLogger'>
# you can also obtain the logger by name to make sure it's in the hierarchy
# NOTE: we'll be getting it from the standard logging module so outsiders need
# not to know that we manage our logging internally
print(logging.getLogger("package.module1").__class__) # <class 'package.logger.CustomLogger'>
# and we can test that it indeed has the custom method:
logging.getLogger("package.module1").new_logger_method("root!")
# INFO:package.module1:new_logger_method() called from: root!.
package.module1.test() # lets call the test method within the module
# INFO:package.module1:new_logger_method() called from: Module 1.
# however, this will not affect anything outside of your package/module, e.g.:
test_logger = logging.getLogger("test_logger")
print(test_logger.__class__) # <class 'logging.Logger'>
test_logger.info("I am a test logger!")
# INFO:test_logger:I am a test logger!
test_logger.new_logger_method("root - test")
# AttributeError: 'Logger' object has no attribute 'new_logger_method'
I have a Python project with multiple modules that use logging. I perform the initialization (reading the log configuration file, creating the root logger, enabling/disabling logging) in every module before logging any messages. Is it possible to perform this initialization only once, in one place (for example in a class called Log), so that the same settings are reused for logging all over the project?
I am looking for a proper solution that reads the configuration file and gets and configures a logger only once, in a class constructor or perhaps in the package initializer (__init__.py). I don't want to do this on the client side (in __main__). I want to do this configuration only once in a separate class and use that class in other modules whenever logging is required.
Set it up using the singleton pattern:

# log.py
import logging.config
import yaml
from singleton_decorator import singleton

@singleton
class Log:
    def __init__(self):
        configFile = 'path_to_my_log_config_file/logging.yaml'
        with open(configFile) as f:
            config_dict = yaml.safe_load(f)
        logging.config.dictConfig(config_dict)
        self.logger = logging.getLogger('root')

    def info(self, message):
        self.logger.info(message)
# module1.py
from log import Log

myLog = Log()
myLog.info('Message logged successfully')
# module2.py
from log import Log

myLog = Log()  # config read only once and only one object is created
myLog.info('Message logged successfully')
From the documentation,
Note that Loggers should NEVER be instantiated directly, but always through the module-level function logging.getLogger(name). Multiple calls to getLogger() with the same name will always return a reference to the same Logger object.
You can initialize and configure logging in your main entry point. See Logging from multiple modules in this Howto (Python 2.7).
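A minimal sketch of that pattern (the module and function names here are only illustrative):

# main.py
import logging
import mymodule  # hypothetical application module

if __name__ == '__main__':
    # configure logging exactly once, in the entry point
    logging.basicConfig(level=logging.INFO,
                        format='%(name)s %(levelname)s %(message)s')
    mymodule.do_work()

# mymodule.py
import logging

logger = logging.getLogger(__name__)  # no configuration here, just a named logger

def do_work():
    logger.info('work done')  # handled by whatever main configured on the root logger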
I had the same problem, and I don't have any classes or anything, so I solved it by just using a global variable.
utils.py:
import logging

existing_loggers = {}

def get_logger(name='my_logger', level=logging.INFO):
    if name in existing_loggers:
        return existing_loggers[name]
    # Do the rest of the initialization: handlers, formatters etc...
    logger = logging.getLogger(name)
    logger.setLevel(level)
    existing_loggers[name] = logger
    return logger
I was wondering what the standard setup is for performing logging from within a Python app.
I am using the Logging class, and I've written my own logger class that instantiates the Logging class. My main then instantiates my logger wrapper class. However, my main instantiates other classes, and I want those other classes to also be able to write to the log file via the logger object in main.
How do I make that logger object available so it can be used by other classes? It's almost like we need some sort of static logger object to get this to work.
I guess the long and short of the question is: how do you implement logging within your code structure such that all classes instantiated from within main can write to the same log file? Do I just have to create a new logging object in each of the classes that points to the same file?
I don't know what you mean by the Logging class - there's no such class in Python's built-in logging. You don't really need wrappers: here's an example of how to do logging from arbitrary classes that you write:
import logging

# This class could be imported from a utility module
class LogMixin(object):
    @property
    def logger(self):
        name = '.'.join([__name__, self.__class__.__name__])
        return logging.getLogger(name)

# This class is just there to show that you can use a mixin like LogMixin
class Base(object):
    pass

# This could be in a module separate from B
class A(Base, LogMixin):
    def __init__(self):
        # Example of logging from a method in one of your classes
        self.logger.debug('Hello from A')

# This could be in a module separate from A
class B(Base, LogMixin):
    def __init__(self):
        # Another example of logging from a method in one of your classes
        self.logger.debug('Hello from B')

def main():
    # Do some work to exercise logging
    a = A()
    b = B()
    with open('myapp.log') as f:
        print('Log file contents:')
        print(f.read())

if __name__ == '__main__':
    # Configure only in your main program clause
    logging.basicConfig(level=logging.DEBUG,
                        filename='myapp.log', filemode='w',
                        format='%(name)s %(levelname)s %(message)s')
    main()
Generally it's not necessary to have loggers at class level: in Python, unlike say Java, the unit of program (de)composition is the module. However, nothing stops you from doing it, as I've shown above. The script, when run, displays:
Log file contents:
__main__.A DEBUG Hello from A
__main__.B DEBUG Hello from B
Note that code from both classes logged to the same file, myapp.log. This would have worked even with A and B in different modules.
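For comparison, the more common module-level idiom (a minimal sketch, not part of the original answer) puts one logger at the top of each module:

# somemodule.py
import logging

logger = logging.getLogger(__name__)  # one logger per module, named after the module

class A(object):
    def __init__(self):
        logger.debug('Hello from A')  # every class in this module shares the module logger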
Try using logging.getLogger() to get your logging object instance:
http://docs.python.org/3/library/logging.html#logging.getLogger
All calls to this function with a given name return the same logger instance. This means that logger instances never need to be passed between different parts of an application.
UPDATE:
The recommended way to do this is to use the getLogger() function and configure it (setting a handler, formatter, etc...):
# main.py
import logging
import lib

def main():
    logger = logging.getLogger('custom_logger')
    logger.setLevel(logging.INFO)
    logger.addHandler(logging.FileHandler('test.log'))
    logger.info('logged from main module')
    lib.log()

if __name__ == '__main__':
    main()
# lib.py
import logging

def log():
    logger = logging.getLogger('custom_logger')
    logger.info('logged from lib module')
If you really need to extend the Logger class, take a look at logging.setLoggerClass(klass).
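A minimal sketch of that approach (the subclass and its extra method are just illustrative):

import logging

class MyLogger(logging.Logger):
    def notice(self, msg, *args, **kwargs):
        # illustrative extra method; forwards to the standard INFO level
        self.info(msg, *args, **kwargs)

logging.basicConfig(level=logging.INFO)
logging.setLoggerClass(MyLogger)             # must run before getLogger() creates the logger
logger = logging.getLogger('custom_logger')  # now an instance of MyLogger
logger.notice('created by the custom Logger subclass')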
UPDATE 2:
Example of how to add a custom logging level without changing the Logger class:
# main.py
import logging
import lib

# Extend Logger class
CUSTOM_LEVEL_NUM = 9
logging.addLevelName(CUSTOM_LEVEL_NUM, 'CUSTOM')

def custom(self, msg, *args, **kwargs):
    self._log(CUSTOM_LEVEL_NUM, msg, args, **kwargs)

logging.Logger.custom = custom

# Do global logger instance setup
logger = logging.getLogger('custom_logger')
logger.setLevel(CUSTOM_LEVEL_NUM)  # level must be at or below the custom level so these records pass
logger.addHandler(logging.FileHandler('test.log'))

def main():
    logger = logging.getLogger('custom_logger')
    logger.custom('logged from main module')
    lib.log()

if __name__ == '__main__':
    main()
Note that adding custom levels is not recommended: http://docs.python.org/2/howto/logging.html#custom-levels
Defining a custom handler and maybe using more than one logger may do the trick for your other requirement: optional output to stderr.
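For instance, a rough sketch of the "optional output to stderr" idea with a second handler (my own illustration, not from the original answer):

import logging
import sys

logger = logging.getLogger('custom_logger')
logger.setLevel(logging.INFO)
logger.addHandler(logging.FileHandler('test.log'))       # always log to the file

verbose = True  # e.g. driven by a command-line flag
if verbose:
    stderr_handler = logging.StreamHandler(sys.stderr)   # optionally echo to stderr
    stderr_handler.setLevel(logging.WARNING)             # only warnings and above on the console
    logger.addHandler(stderr_handler)

logger.warning('goes to both the file and stderr')
logger.info('goes only to the file')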
I'm using an open-source Python library in my project. This library logs a lot of information using the logging module.
...but I can't see the output or log it to a file. I know that I would have to create a logger instance and add a file handler or a console handler to it, but how can I pass this logger instance to the class? Here's the __init__ snippet of the class that I'm going to be using:
class Periscope:
    ''' Main Periscope class'''

    def __init__(self):
        self.config = ConfigParser.SafeConfigParser({"lang": "en"})
        if is_local:
            self.config_file = os.path.join(bd.xdg_config_home, "periscope", "config")
            if not os.path.exists(self.config_file):
                folder = os.path.dirname(self.config_file)
                if not os.path.exists(folder):
                    logging.info("Creating folder %s" % folder)
                    os.mkdir(folder)
                logging.info("Creating config file")
                configfile = open(self.config_file, "w")
                self.config.write(configfile)
                configfile.close()
            else:
                # Load it
                self.config.read(self.config_file)
        self.pluginNames = self.listExistingPlugins()
        self._preferedLanguages = None
Any help?
Thanks guys.
The simplest way is to use the basicConfig function from the logging module. Here's what the docs say:
Does basic configuration for the logging system by creating a StreamHandler with a default Formatter and adding it to the root logger. The function does nothing if any handlers have been defined for the root logger. The functions debug(), info(), warning(), error() and critical() will call basicConfig() automatically if no handlers are defined for the root logger.
This function does nothing if the root logger already has handlers configured.
The logging module is designed so that configuration is separated from creating log messages, so there's no need to have access to a logger instance.
Try setting the level to the lowest possible (DEBUG). This enables all log levels and should give you all logging messages. The simplest way to do default configuration is to use basicConfig():
import logging
logging.basicConfig(level=logging.DEBUG, filename='/path/to/mylog.log')
If the library you are using doesn't override the logging configuration this should be enough to get messages into the log file. If you know the name of the logger the library is using, you can set the level for the library specifically:
logging.getLogger("periscope").setLevel(logging.DEBUG)