Logs are getting written to both the default and the custom file appender - Python

I have written a logging utility with a default logger (let's call it L1) that is initialized from a logconfig.yaml file; when L1 is used, the logs are written to "l1_logfile.log". In addition, I have defined another logger (let's call it L2) with its own appender/handler, so that when L2 is used, the log messages are written to the file "l2_logfile.log".
The L1 logger uses the following static configuration (from the logconfig.yaml file) and is initialized with logging.config.dictConfig(config).
version: 1
formatters:
  json_formatter:
    format: '%(asctime)s %(levelname)s [%(threadName)s] %(name)s %(filename)s:%(funcName)s %(message)s'
    class: pythonjsonlogger.jsonlogger.JsonFormatter
handlers:
  file_handler:
    class: logging.handlers.RotatingFileHandler
    formatter: json_formatter
    filename: ../logs/logfile.log
    maxBytes: 10485760 # 10MB
    backupCount: 20
loggers:
  my_module:
    level: DEBUG
    handlers: [file_handler]
    propagate: no
root:
  level: DEBUG
  handlers: [file_handler]
The L1 logger and its configurator are shown below.
def setup_logging(path):
    with open(path, 'rt') as f:
        try:
            multiprocessing_logging.install_mp_handler()
            config = yaml.safe_load(f.read())
            logging.config.dictConfig(config)
        except Exception as e:
            print('Error in Logging Configuration. Using default configs')
            logging.basicConfig(filemode='w', level=DEFAULT_LOG_LEVEL)
The L2 logger is configured dynamically and has its own custom file appender.
def get_custom_logger_appender(appender_name, logger_name):
    logr_name = appender_name + logger_name
    logger = logging.getLogger(logr_name)
    format_str = '%(asctime)s %(levelname)s [%(threadName)s] %(name)s %(filename)s:%(funcName)s %(message)s'
    formatter = jsonlogger.JsonFormatter(format_str)
    log_file_name = os.path.join("./logs", '{}_{}.log'.format(appender_name, "appln"))
    file_handler = logging.handlers.RotatingFileHandler(log_file_name,
                                                        maxBytes=100000,
                                                        backupCount=15)
    file_handler.setFormatter(formatter)
    file_handler.setLevel(logging.DEBUG)
    logger.addHandler(file_handler)
    logger.setLevel(logging.DEBUG)
    adapter = CustomAdapter(logger, {'app_context': "app_context_value"})
    return adapter
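CustomAdapter is not reproduced here; it is a logging.LoggerAdapter subclass that injects the app_context value into each record. A rough sketch of what such an adapter could look like (a reconstruction for illustration, not the exact class):
import logging

class CustomAdapter(logging.LoggerAdapter):
    """Hypothetical adapter that prefixes every message with the app context.

    The real CustomAdapter is not shown in the question; this is only a
    plausible reconstruction.
    """
    def process(self, msg, kwargs):
        # self.extra is the dict passed at construction time,
        # e.g. {'app_context': 'app_context_value'}.
        return '[%s] %s' % (self.extra.get('app_context'), msg), kwargs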
This is a unit test function.
class TestSum(unittest.TestCase):
    # Default configurator.
    setup_logging("./logconfig.yaml")

    def test_appender_log_file_created(self):
        appender_name = "appender"
        logger_name = "testlogger"
        self.logger = Logger.get_custom_logger_appender(appender_name, logger_name)
        self.logger.info("This is info log 1")
        self.logger.info("This is info log 2")
        self.logger.info("This is info log 3")
        base_dir = self.logger_config.log_base_dir()
        log_file = base_dir + "/" + appender_name + '_' + LoggerConfig.appl_name() + ".log"
        assert os.path.exists(log_file)
The issue is that the three log messages are written to both log files, "l1_logfile.log" and "l2_logfile.log", whereas they should only be written to "l2_logfile.log" because they are emitted through the custom logger appender. What is wrong with my logic?

I solved it by implementing get_logger() with another RotatingFileHandler. The issue was that the logger configuration from the .yaml file was also getting applied to the L2 logger instance by default. The code below configures the logger explicitly.
def get_logger(logger_name):
    log_file_name = os.path.join("logfile.log")
    file_handler = logging.handlers.RotatingFileHandler(log_file_name,
                                                        mode='a',
                                                        maxBytes=10*1024*1024,
                                                        backupCount=15)
    formatter = jsonlogger.JsonFormatter('%(asctime)s %(levelname)s [%(threadName)s] %(name)s %(filename)s:%(funcName)s %(message)s')
    file_handler.setFormatter(formatter)
    logger = logging.getLogger(logger_name)
    logger.setLevel(logging.DEBUG)
    logger.addHandler(file_handler)
    adapter = CustomAdapter(logger, {'app_context': "app_context"})
    Logger.__logger_dict[logger_name] = adapter
    Logger.update_log_level_config(logger_name, logger.level)
    return adapter
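For completeness: an alternative that also keeps the L2 messages out of "l1_logfile.log" is to disable propagation on the custom logger, so its records never bubble up to the root handler installed from the .yaml config (see the propagation answer further down). A rough sketch, not my original code:
import logging
import logging.handlers

def get_isolated_logger(name, log_file):
    """Sketch only: a named logger whose records stay in its own file.

    Setting propagate to False keeps records from also reaching the root
    logger's handlers (the ones configured via logconfig.yaml).
    """
    logger = logging.getLogger(name)
    logger.setLevel(logging.DEBUG)
    logger.propagate = False
    handler = logging.handlers.RotatingFileHandler(log_file, maxBytes=100000, backupCount=15)
    handler.setFormatter(logging.Formatter('%(asctime)s %(levelname)s %(name)s %(message)s'))
    logger.addHandler(handler)
    return logger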

Related

Separate local logger from the root logger in Python

I am using the logging module in Python. In my main.py file I am using two loggers:
Root logger (to collect logs from multiple modules in the same directory)
Local logger (to log specific information)
I want the local logger's output to be separate from the root logger's. But when I create the separate logger, the local logger's messages also show up in the root logger's output.
Here is a sample of how I am doing this:
# main.py
import logging

def setup_logger(filename, name = ''):
    if name == '':
        logging.basicConfig(filename=filename,
                            format='%(asctime)s %(funcName)s %(levelname)s %(message)s',
                            filemode='a')
        logger = logging.getLogger()
    else:
        """
        handler = logging.FileHandler(filename, mode = 'a')
        handler.setFormatter(logging.Formatter('%(asctime)s %(funcName)s %(levelname)s %(message)s'))
        logger = logging.getLogger(name)
        logger.addHandler(handler)
        """
        formatter = logging.Formatter('%(asctime)s %(levelname)s %(message)s')
        handler = logging.FileHandler(filename)
        handler.setFormatter(formatter)
        logger = logging.getLogger(name)
        logger.setLevel(logging.DEBUG)
        logger.addHandler(handler)
        return logger

    logger.setLevel(logging.DEBUG)
    return logger

logger = setup_logger('main.log')
local_logger = setup_logger('local_log.log', 'local_log')
# other file under root log
logger = logging.getLogger("__main__." + __name__)
You have to stop propagation if you don't want the local logger to send its records to the root logger's handlers:
logger.propagate = False
This part of the documentation explains it well: https://docs.python.org/3/howto/logging.html#logging-flow
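Applied to the example above, a minimal sketch of the intended end state (the filenames are the ones from the question):
import logging

logging.basicConfig(filename='main.log', level=logging.DEBUG,
                    format='%(asctime)s %(funcName)s %(levelname)s %(message)s')

local_logger = logging.getLogger('local_log')
local_logger.setLevel(logging.DEBUG)
local_logger.propagate = False  # records stay with this logger's own handlers
local_logger.addHandler(logging.FileHandler('local_log.log'))

logging.getLogger().info('goes to main.log only')
local_logger.info('goes to local_log.log only')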

Changing the default location of a Python log file

I am using a logger in my Python 2.7 project on legacy code. I want to create logs in a specific location, but the Python logging module creates the log files in the default place, i.e. the directory from which the script is executed.
Is there any way to change this default location?
Below is the initialization of the logger.
logger = logging.getLogger(__name__)
logger.setLevel(logging.INFO)
formatter = logging.Formatter('%(message)s')
file_handler = logging.FileHandler('file.log')
file_handler.setFormatter(formatter)
logger.addHandler(file_handler)
Below is an example of how you can set the location and other properties of a Python logger:
You can define a get_logger function as follows:
import logging
import os
LOG_DIR = 'log_dir'
LOG_FORMATTER = logging.Formatter('[%(asctime)s] %(levelname)s %(name)s: %(message)s')
def get_logger(log_name, log_dir = LOG_DIR):
    if not os.path.isdir(log_dir):
        os.mkdir(log_dir)
    logger = logging.getLogger(log_name)
    logging.basicConfig(level = logging.INFO)
    log_handler = logging.FileHandler(os.path.join(log_dir, log_name))
    logger.addHandler(log_handler)
    log_handler.setFormatter(LOG_FORMATTER)
    log_handler.setLevel('INFO')
    return logger
Then, in the file where you want to log, you can do the following:
logger = get_logger('filename')
If you want to write a log message, you can then do:
logger.info('logging information!')

Logging to two files with different settings

I am already using a basic logging config where all messages across all modules are stored in a single file. However, I need a more complex solution now:
Two files: the first remains the same.
The second file should have some custom format.
I have been reading the docs for the module, but they are very complex for me at the moment. Loggers, handlers...
So, in short:
How do I log to two files in Python 3, i.e.:
import logging
# ...
logging.file1.info('Write this to file 1')
logging.file2.info('Write this to file 2')
You can do something like this:
import logging

formatter = logging.Formatter('%(asctime)s %(levelname)s %(message)s')

def setup_logger(name, log_file, level=logging.INFO):
    """To setup as many loggers as you want"""
    handler = logging.FileHandler(log_file)
    handler.setFormatter(formatter)
    logger = logging.getLogger(name)
    logger.setLevel(level)
    logger.addHandler(handler)
    return logger

# first file logger
logger = setup_logger('first_logger', 'first_logfile.log')
logger.info('This is just info message')

# second file logger
super_logger = setup_logger('second_logger', 'second_logfile.log')
super_logger.error('This is an error message')

def another_method():
    # using logger defined above also works here
    logger.info('Inside method')
def setup_logger(logger_name, log_file, level=logging.INFO):
    l = logging.getLogger(logger_name)
    formatter = logging.Formatter('%(message)s')
    fileHandler = logging.FileHandler(log_file, mode='w')
    fileHandler.setFormatter(formatter)
    streamHandler = logging.StreamHandler()
    streamHandler.setFormatter(formatter)
    l.setLevel(level)
    l.addHandler(fileHandler)
    l.addHandler(streamHandler)

setup_logger('log1', txtName+"txt")
setup_logger('log2', txtName+"small.txt")
logger_1 = logging.getLogger('log1')
logger_2 = logging.getLogger('log2')
logger_1.info('111messasage 1')
logger_2.info('222ersaror foo')

Python logging.getLogger(name) with FileHandler does not write to file

If I get the logger with a name and add a FileHandler it does not write to the file.
This works and writes correctly to the file:
log = logging.getLogger()
fh = logging.FileHandler(logfile)
log.addHandler(fh)
fh_fmt = logging.Formatter("%(asctime)s (%(levelname)s)\t: %(message)s")
fh.setFormatter(fh_fmt)
log.setLevel(logging.INFO)
This does not write to the file:
log = logging.getLogger(name)
fh = logging.FileHandler(logfile)
log.addHandler(fh)
fh_fmt = logging.Formatter("%(asctime)s (%(levelname)s)\t: %(message)s")
fh.setFormatter(fh_fmt)
log.setLevel(logging.INFO)
The only difference is that I get a 'named' logger.
This is a rather old question, but I believe I found the underlying problem and solution, at least with a newer version of Python.
The second code example starts with log = logging.getLogger(name), where name is presumed to be a string representing the name of the logger. Since a name is provided, this log will not be the root logger. According to Logger.setLevel(level) docs for Python 3.6+,
When a logger is created, the level is set to NOTSET (which causes all messages to be processed when the logger is the root logger, or delegation to the parent when the logger is a non-root logger).
This tells us that we have to set the level of our logger so that it will actually process the messages instead of passing it to the root logger.
This is a code example I wrote (in Python 3.7) that does not work:
from pathlib import Path
import logging
formatter = logging.Formatter('%(name)s [%(levelname)s] %(message)s')
log_file_dir = Path('./log/')
config_file = 'config_file.txt'
config_file_path = log_file_dir / config_file
logger = logging.getLogger('example_logger')
fh = logging.FileHandler(config_file_path, mode='w')
fh.setLevel(logging.INFO)
fh.setFormatter(formatter)
logger.addHandler(fh)
logger.info('Start Configuration Log')
And this one works by adding one line:
from pathlib import Path
import logging
formatter = logging.Formatter('%(name)s [%(levelname)s] %(message)s')
log_file_dir = Path('./log/')
config_file = 'config_file_2.txt'
config_file_path = log_file_dir / config_file
logger = logging.getLogger('example_logger')
logger.setLevel(logging.INFO) # <------ Or the applicable level for your use-case
fh = logging.FileHandler(config_file_path, mode='w')
fh.setLevel(logging.INFO)
fh.setFormatter(formatter)
logger.addHandler(fh)
logger.info('Start Configuration Log')
Note: The first code example does create the chosen log file, but does not write 'Start Configuration Log' to the file.
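A quick way to see what is going on, in a fresh interpreter (a small sketch reusing the logger name from the example):
import logging

logger = logging.getLogger('example_logger')  # level stays NOTSET here
print(logger.level)                 # 0  (NOTSET)
print(logger.getEffectiveLevel())   # 30 (WARNING, inherited from the root logger)
# INFO (20) < WARNING (30), so logger.info(...) is dropped before the handler sees it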

Restart logging to a new file (Python)

I'm using the following code to initialize logging in my application:
logger = logging.getLogger()
logger.setLevel(logging.DEBUG)
# log to a file
directory = '/reserved/DYPE/logfiles'
now = datetime.now().strftime("%Y%m%d_%H%M%S")
filename = os.path.join(directory, 'dype_%s.log' % now)
file_handler = logging.FileHandler(filename)
file_handler.setLevel(logging.DEBUG)
formatter = logging.Formatter("%(asctime)s %(filename)s, %(lineno)d, %(funcName)s: %(message)s")
file_handler.setFormatter(formatter)
logger.addHandler(file_handler)
# log to the console
console_handler = logging.StreamHandler()
level = logging.INFO
console_handler.setLevel(level)
logger.addHandler(console_handler)
logging.debug('logging initialized')
How can I close the current logging file and restart logging to a new file?
Note: I don't want to use RotatingFileHandler, because I want full control over all the filenames.
You can manually re-assign the handler if you want, using removeHandler and addHandler, or you can access logger.handlers[index_of_handler_here].stream and replace the stream manually, but I'd recommend the former over the latter.
logger.handlers[0].stream.close()
logger.removeHandler(logger.handlers[0])
file_handler = logging.FileHandler(filename)
file_handler.setLevel(logging.DEBUG)
formatter = logging.Formatter("%(asctime)s %(filename)s, %(lineno)d, %(funcName)s: %(message)s")
file_handler.setFormatter(formatter)
logger.addHandler(file_handler)
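Wrapped up as a helper, roughly (restart_file_logging is just an illustrative name, and it assumes the file handler is the logger's first handler, as in the code above):
import logging

def restart_file_logging(logger, new_filename):
    """Close the current file handler and start logging to a new file."""
    old_handler = logger.handlers[0]  # assumes the file handler was added first
    logger.removeHandler(old_handler)
    old_handler.close()               # closes the underlying stream
    new_handler = logging.FileHandler(new_filename)
    new_handler.setLevel(logging.DEBUG)
    new_handler.setFormatter(logging.Formatter(
        "%(asctime)s %(filename)s, %(lineno)d, %(funcName)s: %(message)s"))
    logger.addHandler(new_handler)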
Here is what I do:
def initLogging(filename):
    global logger
    if logger == None:
        logger = logging.getLogger()
    else:  # wish there was a logger.close()
        for handler in logger.handlers[:]:  # make a copy of the list
            logger.removeHandler(handler)
    logger.setLevel(logging.DEBUG)
    formatter = logging.Formatter(fmt='%(asctime)s: %(message)s', datefmt='%I:%M:%S')
    fh = logging.FileHandler(filename)
    fh.setFormatter(formatter)
    logger.addHandler(fh)
    sh = logging.StreamHandler(sys.stdout)
    sh.setFormatter(formatter)
    logger.addHandler(sh)
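Usage would then look roughly like this (the module-level logger = None assignment is assumed so that the global works; the filenames are made up):
import logging
import sys

logger = None  # module-level variable that initLogging() manages

initLogging('run_1.log')
logger.info('logged to run_1.log')

initLogging('run_2.log')  # removes the old handlers and starts a new file
logger.info('logged to run_2.log')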
