I would like to have:
a main.log file that captures all logs of DEBUG level and above from main and the imported modules
the console should show only ERROR level logs from main and its imported submodules.
Note: I may have no control over the error-handling logs of the imported submodules.
Here is the main.py code for this:
# main.py importing a submodule
import logging
import submodule
logger = logging.getLogger(__name__)
logger.setLevel(logging.DEBUG)
# log to console
c_handler = logging.StreamHandler()
console_format = logging.Formatter("[%(levelname)s] %(message)s")
c_handler.setFormatter(console_format)
c_handler.setLevel = logging.INFO
logger.addHandler(c_handler)
logger.error("This is an error!!! Logged to console")
# log to file from main
logfile = "./logging/main.log"
f_handler = logging.FileHandler(filename=logfile)
f_format = logging.Formatter("%(asctime)s: %(name)-18s [%(levelname)-8s] %(message)s")
f_handler.setFormatter(f_format)
f_handler.setLevel = logging.DEBUG
logger.addHandler(f_handler)
logger.debug("This is a debug error. Not logged to console, but should log to file")
... and the submodule.py code ...
# submodule.py
import logging
logger = logging.getLogger(__name__)
logger.setLevel(logging.INFO)
formatter = logging.Formatter('%(levelname)s:%(name)s:%(message)s')
# log to console
c_handler = logging.StreamHandler()
c_handler.setFormatter(formatter)
logger.addHandler(c_handler)
logger.info("This is an info message from submodule, should be recorded in main.log!")
logger.debug("This is a debug message from submodule, also should be recorded in main.log!!")
When I run main.py:
"[ERROR] This is an error!!! Logged to console" shows up correctly in the console
But...
Console also shows...
INFO:submodule:This is an info message from submodule, should be recorded in main.log!
[DEBUG] This is a debug error. Not logged to console, but should log to file
The main.log file shows only yy-mm-dd hh:mm:ss: __main__ [DEBUG ] This is a debug error. Not logged to console, but should log to file. It does not show the logs from submodule.py.
Appreciate knowing:
Where am I going wrong?
What would be the code correction needed?
EDIT: Based on Dan D.'s suggestion, I changed submodule.py as follows:
# submodule.py
import logging
logger = logging.getLogger(__name__)
def logsomething():
logger.info("This is an info message from submodule, should be recorded in main.log!")
logger.debug("This is a debug message from submodule, also should be recorded in main.log!!")
... and the program logs to console and file appropriately.
Q. If I want to change the message format for submodule.py only, can this be done through main.py?
Your submodule should just be:
import logging
logger = logging.getLogger(__name__)
logger.info("This is an info message from submodule, should be recorded in main.log!")
logger.debug("This is a debug message from submodule, also should be recorded in main.log!!")
Then your main module should be:
# main.py importing a submodule
import logging
logger = logging.getLogger(__name__)
# log to console
c_handler = logging.StreamHandler()
console_format = logging.Formatter("[%(levelname)s] %(message)s")
c_handler.setFormatter(console_format)
c_handler.setLevel(logging.INFO)
logging.getLogger().addHandler(c_handler)
# log to file from main
logfile = "./logging/main.log"
f_handler = logging.FileHandler(filename=logfile)
f_format = logging.Formatter("%(asctime)s: %(name)-18s [%(levelname)-8s] %(message)s")
f_handler.setFormatter(f_format)
f_handler.setLevel(logging.DEBUG)
logging.getLogger().addHandler(f_handler)
logging.getLogger().setLevel(logging.DEBUG)
import submodule
logger.error("This is an error!!! Logged to console")
logger.debug("This is a debug error. Not logged to console, but should log to file")
Edit: The handlers have to be added before the code in the submodule runs. To effect this, the import submodule statement was moved after the code that sets up the handlers.
Normally modules shouldn't have any top-level logging calls, so all the imports can be done at the top; the callables that use logging are then called indirectly by the code under if __name__ == "__main__":, after it has set up logging. A minimal sketch of that pattern follows.
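For reference, here is a minimal sketch of that pattern (the function name do_work is just a placeholder): the submodule only creates a module-level logger and does its logging inside a function, and main configures the handlers before calling it.

# submodule.py -- sketch: no handlers, no top-level logging calls
import logging

logger = logging.getLogger(__name__)

def do_work():
    logger.info("This is an info message from submodule, should be recorded in main.log!")
    logger.debug("This is a debug message from submodule, also should be recorded in main.log!!")

# main.py -- sketch: configure logging first, then call into the submodule
import logging
import submodule  # safe at the top: importing it no longer logs anything

if __name__ == "__main__":
    c_handler = logging.StreamHandler()
    c_handler.setLevel(logging.ERROR)  # console: ERROR and above only
    c_handler.setFormatter(logging.Formatter("[%(levelname)s] %(message)s"))
    f_handler = logging.FileHandler("main.log")
    f_handler.setLevel(logging.DEBUG)  # file: everything from DEBUG up
    f_handler.setFormatter(logging.Formatter("%(asctime)s: %(name)-18s [%(levelname)-8s] %(message)s"))
    root = logging.getLogger()
    root.setLevel(logging.DEBUG)
    root.addHandler(c_handler)
    root.addHandler(f_handler)
    submodule.do_work()  # nothing on the console; both messages land in main.log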
Related
I am trying to create logs for errors. This is the logger I am using.
import logging
import os
# logger lives at module level so the helper functions below can see it
logger = logging.getLogger()

def create_log(source):
    logging.basicConfig(filename="logs/" + source + ".log",
                        format='%(asctime)s::%(levelname)s::%(message)s',
                        filemode='a')
    logger.setLevel(logging.INFO)

def info_logger(message):
    logger.info(message)

def error_logger(message):
    print(message)
    logger.error(message)
I am calling this logger in a for loop where I am doing some operation and trying to create logs for each iteration:
for i in data["source_id"]:
    # --Some task here--
    log_file_name = str(source_dict["source_id"]) + "_" + source_dict["source_name"] + "_" + str(datetime.today().strftime("%Y-%m-%d_%H_%M_%S"))
    create_log(log_file_name)
For the first iteration, the log file is created. But for the other iterations, the same log file keeps getting appended to. I want to make separate log files for each iteration. Any idea how I can do that?
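For context: logging.basicConfig only configures the root logger the first time it is called; later calls are silently ignored (unless you pass force=True on Python 3.8+), which is why every iteration keeps writing to the first file. Below is a minimal sketch of one way around that, assuming a helper named switch_log_file that you would call at the top of each loop iteration instead of create_log:

import logging

def switch_log_file(source):
    # hypothetical helper: drop any existing file handler, then attach one for the new file
    root = logging.getLogger()
    root.setLevel(logging.INFO)
    for handler in list(root.handlers):
        if isinstance(handler, logging.FileHandler):
            root.removeHandler(handler)
            handler.close()
    file_handler = logging.FileHandler("logs/" + source + ".log", mode="a")
    file_handler.setFormatter(logging.Formatter('%(asctime)s::%(levelname)s::%(message)s'))
    root.addHandler(file_handler)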
You can try this
import logging
debug = logging.FileHandler("debug.log")
debug.setLevel(logging.DEBUG)
error = logging.FileHandler("error.log")
error.setLevel(logging.ERROR)
warning = logging.FileHandler("warning.log")
warning.setLevel(logging.WARNING)
console = logging.StreamHandler()
logging.basicConfig( # noqa
level=logging.INFO,
format="[%(asctime)s]:%(levelname)s %(name)s :%(module)s/%(funcName)s,%(lineno)d: %(message)s",
handlers=[debug, error, warning, console]
)
logger = logging.getLogger()
logger.debug("This is debug [error+warning]")
logger.error("This is error [error only]")
logger.warning("This is warn [error+warning]")
I have a project that consists of several modules. There is a main module (main.py) that creates a Tk GUI and loads the data. It passes this data to process.py, which processes the data using functions from checks.py. I am trying to implement logging for all the modules so they log to a file. In main.py log messages are written to the log file, but in the other modules they are only written to the console. I assume it's to do with the import module line executing part of the code before the code in main.py has set up the logger, but I can't work out how to arrange it to avoid that. It seems like a reasonably common question on Stack Overflow, but I couldn't get the other answers to work for me. I am sure I am not missing much. Simplified code is shown below:
I have tried moving the logging code inside and outside of various functions in the modules. The code I used to start me off is from Corey Schafer's YouTube channel.
Main.py
import logging
import tempfile
from process import process_data

def main():
    logger = logging.getLogger()
    logger.setLevel(logging.DEBUG)
    formatter = logging.Formatter('%(asctime)s:%(name)s:%(message)s')
    templogfile = tempfile.gettempdir() + '\\' + 'TST_HA_Debug.log'
    file_handler = logging.FileHandler(templogfile)
    file_handler.setLevel(logging.DEBUG)
    file_handler.setFormatter(formatter)
    stream_handler = logging.StreamHandler()
    stream_handler.setFormatter(formatter)
    logger.addHandler(file_handler)
    logger.addHandler(stream_handler)
    logger.debug('Logging has started') # This gets written to the file
    process_data(data_object) # call process_data in process.py
process.py
import logging

def process_data(data):
    logger = logging.getLogger(__name__)
    logger.debug('This message is logged by process') # This won't get written to the log file but gets written to the console
    # do some stuff with data here and log some msgs
    return
Main.py will write to the log file, process.py will only write to the console.
I've rewritten your scripts a little so that this code can stand alone. If I changed this too much let me know and I can revisit it. These two files are an example of having it log to file. Note my comments:
## main.py
import logging
from process import process_data
import os

def main():
    # Give this logger a name
    logger = logging.getLogger("Example")
    logger.setLevel(logging.DEBUG)
    formatter = logging.Formatter('%(asctime)s:%(name)s:%(message)s')
    # I changed this to the same directory, for convenience
    templogfile = os.path.join(os.getcwd(), 'TST_HA_Debug.log')
    file_handler = logging.FileHandler(templogfile)
    file_handler.setLevel(logging.DEBUG)
    file_handler.setFormatter(formatter)
    stream_handler = logging.StreamHandler()
    stream_handler.setFormatter(formatter)
    logger.addHandler(file_handler)
    logger.addHandler(stream_handler)
    logging.getLogger("Example").debug('Logging has started') # This still gets written to the file
    process_data() # call process_data in process.py

if __name__ == '__main__':
    main()
## process.py
import logging

def process_data(data=None):
    # make sure to grab the correct logger
    logger = logging.getLogger("Example")
    logger.debug('This message is logged by process') # This does get logged to file now
    # do some stuff with data here and log some msgs
    return
Why does this work? Because the module-level functions use the default root logger, which is not the one you've configured. For more details on this see these docs. There is a similar question that goes more into depth here.
By getting the configured logger before you start logging, you are able to log to the right configuration. Hope this helps!
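An alternative that keeps logging.getLogger(__name__) in process.py is to configure the root logger in main; records from every module's __name__ logger propagate up to the root's handlers. A minimal sketch under that assumption:

## main.py -- alternative sketch: configure the root logger once
import logging
import os
from process import process_data

def main():
    root = logging.getLogger()  # the root logger
    root.setLevel(logging.DEBUG)
    formatter = logging.Formatter('%(asctime)s:%(name)s:%(message)s')
    file_handler = logging.FileHandler(os.path.join(os.getcwd(), 'TST_HA_Debug.log'))
    file_handler.setFormatter(formatter)
    stream_handler = logging.StreamHandler()
    stream_handler.setFormatter(formatter)
    root.addHandler(file_handler)
    root.addHandler(stream_handler)
    logging.getLogger(__name__).debug('Logging has started')
    process_data()  # process.py can keep logger = logging.getLogger(__name__)

if __name__ == '__main__':
    main()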
I have a very simple structure. But only one of my two logging handlers is logging from my modules:
program.py,
support_module1.py,
support_module2.py
#program.py
import logging
import sys

import support_module1 as SM1
import support_module2 as SM2

log = logging.getLogger(__name__)
logging.basicConfig(
    filename='/logs/TestLog.log',
    filemode='w',
    level='DEBUG',
    format="%(asctime)s - %(name)s - %(levelname)s - %(message)s",
    handlers=[logging.FileHandler(r'/logs/TestLog.log')])

stdout_handler = logging.StreamHandler(sys.stdout)
stdout_handler.setLevel(logging.INFO)
log.addHandler(stdout_handler)

log.debug("shows in file")
log.info("shows in file and in stdout")

SM1.function1()
SM2.function2()
Modules
#support_module1.py
import logging

mod1_log = logging.getLogger(__name__)

def function1():
    mod1_log.debug("shows in file")
    mod1_log.info("should show in file and in stdout, but only goes to file")
#support_module2.py
import logging

mod2_log = logging.getLogger(__name__)

def function2():
    mod2_log.debug("shows in file")
    mod2_log.info("should show in file and in stdout, but only goes to file")
When I run I get:
shows in file and in stdout
I'm expecting:
shows in file and in stdout
should show in file and in stdout, but only goes to file
should show in file and in stdout, but only goes to file
Can anyone tell me what I'm doing wrong?
hoefling perfectly explained why and how to fix it. Thank you!
In program.py, you are configuring logging.getLogger(__name__). This will affect only the logger named program.py and thus only log records inside program.py itself. The logging.getLogger(__name__) inside module1.py will return a different logger named module1.py, which is unaffected by the configuration in program.py. The fix is very simple: replace logging.getLogger(__name__) with logging.getLogger() in program.py. This will configure the root logger instead.
-hoefling
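Applied to the code above, the fix looks roughly like this (a sketch: the redundant handlers argument is dropped, since filename already installs a file handler, and the file path is shortened here):

#program.py -- sketch of hoefling's fix: configure the root logger so modules propagate to it
import logging
import sys

import support_module1 as SM1
import support_module2 as SM2

logging.basicConfig(
    filename='TestLog.log',
    filemode='w',
    level='DEBUG',
    format="%(asctime)s - %(name)s - %(levelname)s - %(message)s")

log = logging.getLogger()  # root logger instead of logging.getLogger(__name__)
stdout_handler = logging.StreamHandler(sys.stdout)
stdout_handler.setLevel(logging.INFO)
log.addHandler(stdout_handler)  # the module loggers now propagate to this handler too

log.debug("shows in file")
log.info("shows in file and in stdout")
SM1.function1()
SM2.function2()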
Is there any way I can provide the filename for logger from my main module?
I am using the following approach, however it's not working. All the logs go to the xyz.log file rather than main.log.
Updated as per the suggestion from nosklo.
logger.py
import logging

formatter = logging.Formatter(fmt='[%(asctime)s] - {%(filename)s:%(lineno)d} %(levelname)s - %(message)s')

def _get_file_handler(file_name="xyz.log"):
    file_handler = logging.FileHandler(file_name)
    file_handler.setLevel(logging.DEBUG)
    file_handler.setFormatter(formatter)
    return file_handler

def get_logger(name):
    logger = logging.getLogger(name)
    logger.setLevel(logging.DEBUG)
    logger.addHandler(_get_file_handler())
    return logger
parser.py
import logger

log = logger.get_logger(__name__)

def parse():
    log.info("is there anyway this could go to main.log and xyz.log")
main.py
import logging

import logger
import parser

log = logger.get_logger(__name__)

if __name__ == '__main__':
    for handler in log.handlers:
        if isinstance(handler, logging.FileHandler):
            log.removeHandler(handler)
    log.addHandler(logger._get_file_handler())
    log.info("is there anyway this could go to main.log and xyz.log?")
    parser.parse()
Is there a way I can set the Log file name from my main.py module and not from logger.py module?
You're calling get_logger() first, so by the time you set the class attribute FileName.file_name = "main.log" the get_logger function has already finished, and the logger is already defined to write to xyz.log. Changing the variable later won't change the logger anymore, since it is already defined.
To change the previously selected file, you'd have to retrieve the logger, remove the previous handler and add a new file handler. Another option is to set the variable before calling get_logger() so when you call it, the variable already has the correct value.
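A sketch of that second option, letting the caller pass the file name into get_logger (the extra parameter is my addition, not part of the original logger.py):

# logger.py
import logging

formatter = logging.Formatter(fmt='[%(asctime)s] - {%(filename)s:%(lineno)d} %(levelname)s - %(message)s')

def _get_file_handler(file_name="xyz.log"):
    file_handler = logging.FileHandler(file_name)
    file_handler.setLevel(logging.DEBUG)
    file_handler.setFormatter(formatter)
    return file_handler

def get_logger(name, file_name="xyz.log"):
    logger = logging.getLogger(name)
    logger.setLevel(logging.DEBUG)
    logger.addHandler(_get_file_handler(file_name))
    return logger

main.py can then call log = logger.get_logger(__name__, "main.log") and drop the remove/add handler dance, while parser.py keeps log = logger.get_logger(__name__) and continues writing to xyz.log.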
Logging instances can have multiple file handlers. Use a function like this to just add another handler with the additional output path you want. Log messages will get sent to both (or all) text logs added to the instance. You can even configure the handlers to have different logging levels so you can filter messages to different logs for critical errors, info message, etc.
import logging

def add_handler(output_log_path, log):
    # Set up text logger and add it to logging instance
    file_logger = logging.FileHandler(output_log_path)
    file_logger.setLevel(logging.DEBUG)
    formatter = logging.Formatter('%(asctime)s | logger name: %(name)s | module: %(module)s | lineno: %(lineno)d | %(message)s')
    file_logger.setFormatter(formatter)
    log.addHandler(file_logger)
    return log
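Using the add_handler function defined above, usage then looks like this (the file names are just examples); every message sent to the logger goes to each file handler that has been added:

import logging

log = logging.getLogger(__name__)
log.setLevel(logging.DEBUG)
add_handler('run.log', log)
add_handler('run_copy.log', log)  # a second output path
log.info('this message is written to both run.log and run_copy.log')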
I am trying to print logs using the logging module in Python.
Following is the code I am keeping at the top of the file.
import logging

if __name__ == '__main__':
    LOG_FILENAME = '/home/akash/exdion-pdf-extracter/doc/epod.log'
    logging.basicConfig(
        filename=LOG_FILENAME,
        level=logging.DEBUG,
    )
There are different files with function calls from one another. I have used the following line to write a message to the logger.
@staticmethod
def initiate_pdf_processing(ct_doc, pt_doc, feature, startAndEndKeyList):
    logging.info("testing logger")
    ...
There are multiple instances of similar logger calls like the one above. But I can't see the logger output in the designated file. The code and files are huge. However, there are a few errors generated by the code which do get printed in the log file.
Use the code below outside of the main namespace. This way, you define the logger and create the log file as globals, and you can call the logger anywhere in the code. The snippet below is how I usually set up logging.
import logging
import os

logfile = '<your_file_name>.log'
if os.path.isfile(logfile):
    os.remove(logfile)

file_handler = logging.FileHandler(logfile)
file_handler.setLevel(logging.DEBUG)
file_handler.setFormatter(logging.Formatter(
    '%(asctime)s %(pathname)s [%(process)d]: %(levelname)s:: %(message)s'))

logger = logging.getLogger('wbs-server-log')
logger.setLevel(logging.DEBUG)
logger.addHandler(file_handler)
The issue might be that you have to initialize logging above the if __name__ == '__main__': block. That way logging will be initialized when you import this file as a module.
Suggestion for initializing logging:
import logging

log = logging.getLogger(PACKAGE_NAME)
log.setLevel(logging.DEBUG)  # without this the logger inherits WARNING from the root and the debug call below is dropped
stream_handler = logging.StreamHandler(stream=open(LOG_FILE_NAME, 'a'))
stream_handler.setLevel(logging.DEBUG)
log.addHandler(stream_handler)
log.debug('your message here')
After this you can tweak log message formatting with logging.Formatter.
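For example (the format string and names here are just an illustration):

import logging

log = logging.getLogger("mypackage")
stream_handler = logging.StreamHandler(stream=open("mypackage.log", "a"))
stream_handler.setFormatter(logging.Formatter('%(asctime)s %(name)s [%(levelname)s] %(message)s'))
log.setLevel(logging.DEBUG)
log.addHandler(stream_handler)
log.debug('this message now carries a timestamp and the level name')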