Python logging stdout and stderr based on level - python

Using Python 3 logging, how can I specify that logging.debug() and logging.info() should go to stdout and logging.warning() and logging.error() go to stderr?

You can create separate loggers for each stream like this:
import logging
import sys
formatter = logging.Formatter("%(levelname)s %(message)s")
stdout_logger = logging.getLogger("stdout_logger")
stdout_logger.setLevel(logging.DEBUG)
stderr_logger = logging.getLogger("stderr_logger")
stderr_logger.setLevel(logging.DEBUG)
stdout_handler = logging.StreamHandler(stream=sys.stdout)
stdout_handler.setFormatter(formatter)
stderr_handler = logging.StreamHandler(stream=sys.stderr)
stderr_handler.setFormatter(formatter)
stdout_logger.addHandler(stdout_handler)
stderr_logger.addHandler(stderr_handler)
stdout_logger.info("this will output to stdout")
stderr_logger.info("this will output to stderr")
Then, if you want to log something at the 'debug' or 'info' level, you can just use stdout_logger. For 'warning' and 'error' level messages, use stderr_logger.

How to do this kind of thing is documented in the logging cookbook, and though the example levels/destinations in the cookbook recipe are slightly different from those in your question, there should be enough information there for you to arrive at a solution. The key thing is using a filter function and attaching it to a handler.
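As a minimal sketch of that idea (using the levels from your question rather than the cookbook's; the logger name is illustrative), a plain function can serve as the filter:
import logging
import sys
def below_warning(record):
    # only DEBUG and INFO records pass this handler's filter
    return record.levelno < logging.WARNING
stdout_handler = logging.StreamHandler(sys.stdout)
stdout_handler.addFilter(below_warning)
stderr_handler = logging.StreamHandler(sys.stderr)
stderr_handler.setLevel(logging.WARNING)
logger = logging.getLogger("app")
logger.setLevel(logging.DEBUG)
logger.addHandler(stdout_handler)
logger.addHandler(stderr_handler)
logger.info("this goes to stdout")
logger.error("this goes to stderr")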

Related

Python default and multi level logging

From what I read and understood, the Python logging module logs to stderr by default.
If I run this python code:
import logging
logging.info('test')
logging.warning('test')
logging.error('test')
logging.debug('test')
as
python main.py 1> stdout.txt 2> stderr.txt
I get my logs in stderr.txt and nothing in stdout.txt - my logs are redirected to stderr.
This default behaviour is problematic when logs are streamed to log aggregation services such as Datadog or Papertrail. Since they are streamed to stderr, the logs are marked as errors when in reality they are not.
So I tried to create multiple log handlers as follows:
import logging
import sys
stdoutHandler = logging.StreamHandler(stream=sys.stdout)
stderrHandler = logging.StreamHandler(stream=sys.stderr)
logging.basicConfig(level=logging.DEBUG, handlers=[stdoutHandler, stderrHandler])
stdoutHandler.setLevel(logging.DEBUG)
stderrHandler.setLevel(logging.ERROR)
logging.info('test')
logging.warning('test')
logging.error('test')
logging.debug('test')
When I run this code, I get the errors in stderr.txt but also all of the logs in stdout.txt - I end up with duplication: the error logs appear in both the stderr and stdout streams.
Is there a better way to handle the differentiation of error logs from the rest in Python?
I tried the loguru package as well, but no luck with stream separation there either...
Thanks in advance
You can probably benefit from the approach described in the official documentation.
Following the approach described in the official documentation:
https://docs.python.org/3.10/howto/logging-cookbook.html#custom-handling-of-levels
Here's my code with stream segregation:
import sys
import logging
ROOT_LEVEL = logging.DEBUG  # lowest level the root logger will handle
class StdoutFilter(logging.Filter):
    def filter(self, record):
        # pass records from ROOT_LEVEL up to (but not including) ERROR
        return ROOT_LEVEL <= record.levelno < logging.ERROR
class StderrFilter(logging.Filter):
    def filter(self, record):
        # pass only ERROR and above
        return record.levelno >= logging.ERROR
stdoutHandler = logging.StreamHandler(stream=sys.stdout)
stdoutHandler.addFilter(StdoutFilter())
stderrHandler = logging.StreamHandler(stream=sys.stderr)
stderrHandler.addFilter(StderrFilter())
root_logger = logging.getLogger()
root_logger.setLevel(ROOT_LEVEL)
root_logger.addHandler(stderrHandler)
root_logger.addHandler(stdoutHandler)
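With that configuration in place (and no levels set on the handlers themselves), each record ends up on exactly one stream:
root_logger.info("goes only to stdout")   # below ERROR, passes StdoutFilter
root_logger.error("goes only to stderr")  # ERROR, passes StderrFilter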

Use different logger handlers for different levels

I'd like to send all logs (including debug) to a file and to print only logs with a level of info or higher to stderr.
Currently, every module begins with this:
logger = logging.getLogger(__name__)
I haven't been able to achieve this without duplication of the logs (from the child and root loggers) and without modification of every module in my application.
What's the simplest way to do it?
You can configure two different handlers:
import logging
from logging import FileHandler, StreamHandler, INFO, DEBUG
file_handler = FileHandler("info.log")
file_handler.setLevel(DEBUG)
std_err_handler = StreamHandler()
std_err_handler.setLevel(INFO)
logging.basicConfig(handlers=[file_handler, std_err_handler], level=DEBUG)
logging.info("info") # writes to file and stderr
logging.debug("debug") # writes only to file
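Because module-level loggers created with logging.getLogger(__name__) have no handlers of their own and propagate records to the root logger, the same two handlers apply everywhere without modifying each module. A small sketch (the module name is illustrative):
# inside mypackage/worker.py
import logging
logger = logging.getLogger(__name__)
logger.debug("debug detail")    # file only, handled by the root logger's handlers
logger.info("progress update")  # file and stderr, handled by the root logger's handlers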

Python logger not printing Info

Looking at the Python docs, if I set my logger level to INFO, it should print out all logs at level INFO and above.
However, the code snippet below only prints "error".
import logging
logger = logging.getLogger()
logger.setLevel(logging.INFO)
logger.info("Info")
logger.error("error")
logger.info("info")
Output
error
What could be the reason for this?
Use logging.basicConfig to set a default level and a default handler:
import logging
logger = logging.getLogger()
logging.basicConfig(level=logging.INFO)
logger.info("Info")
logger.error("error")
logger.info("info")
prints:
INFO:root:Info
ERROR:root:error
INFO:root:info
The logging module is powerful yet confusing. Look into the HOWTO in the docs for a tutorial. I've made my own helper function that logs to stderr and a file that I've detailed on my blog. You might like to adapt it to your needs.
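The helper from the blog isn't reproduced here, but one along those lines might look roughly like this (the file name and format are illustrative):
import logging
def setup_logging(logfile="app.log", console_level=logging.INFO):
    # everything goes to the file; console_level and above also go to stderr
    file_handler = logging.FileHandler(logfile)
    file_handler.setLevel(logging.DEBUG)
    console_handler = logging.StreamHandler()  # defaults to sys.stderr
    console_handler.setLevel(console_level)
    logging.basicConfig(level=logging.DEBUG,
                        format="%(asctime)s %(levelname)s %(name)s: %(message)s",
                        handlers=[file_handler, console_handler])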
The reason for this behaviour is that no logging handlers are defined.
In this case the handler logging.lastResort is used.
By default this handler is <_StderrHandler (WARNING)>.
It logs to stderr, but only from level WARNING upwards.
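You can see this for yourself in an interactive session:
import logging
print(logging.lastResort)        # <_StderrHandler (WARNING)>
print(logging.lastResort.level)  # 30, i.e. logging.WARNING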
For your example you could do the following (logging to stdout):
import logging
import sys
logger = logging.getLogger()
logger.setLevel(logging.INFO)
logger.addHandler(logging.StreamHandler(sys.stdout))
logger.info("info")
Output
info
In the howto you can find other useful handlers.

Making logging module write all logs to only stderr

I am using pytest to test a CLI that produces some output. While running the test, I want to set my CLI's log level to DEBUG. However, I don't want CLI logs to interfere with tests that are parsing the output of the CLI.
How can I make logging module send all the logs to only stderr? I looked at this post but it talks about sending logs to stderr in addition to stdout.
You can direct the log output for a given logger to stderr as follows. This defaults to stderr for output, but you can use sys.stdout instead if you prefer.
import logging
import sys
DEFAULT_LOGGER_NAME = 'default_logger'
def init_logging(logger_name=DEFAULT_LOGGER_NAME,
                 log_level=logging.DEBUG,
                 stream=None):
    # get (or create) the named logger
    logger = logging.getLogger(logger_name)
    logger.setLevel(log_level)
    # create formatter
    formatter = logging.Formatter('%(asctime)s - %(name)s - %(levelname)s - %(message)s')
    # create console handler at the requested level (stderr unless a stream is given)
    if stream is None:
        stream = sys.stderr
    ch = logging.StreamHandler(stream=stream)
    ch.setLevel(log_level)
    # add formatter to ch
    ch.setFormatter(formatter)
    # add ch to logger
    logger.addHandler(ch)
    return logger
This function needs to be called at the beginning of your program (e.g. beginning of main()).
Then within the code, you just need to call the following:
logger = logging.getLogger(DEFAULT_LOGGER_NAME)
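For example, assuming init_logging() has already been called in main(), any other module can pick up the same configured logger by name:
# in some other module of the program
import logging
logger = logging.getLogger('default_logger')  # same name as DEFAULT_LOGGER_NAME above
logger.debug("this ends up on stderr by default")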
Do the same as the linked answer does, but replace stdout with stderr. So you would create a handler with logging.StreamHandler(sys.stderr) and make sure this is the only active handler if you want logs to go exclusively to stderr (see the sketch after the addendum below).
As #Tomerikoo correctly points out, you don't need to do anything, though, as logging defaults to using a StreamHandler with stderr. The only real value of the code below is that it sets a different level than the default. Just logging.warning('log') with no other setup will send a log to stderr.
Addendum: you can also achieve this using basicConfig to have less boilerplate code.
import logging
import sys
logging.basicConfig(stream=sys.stderr, level=logging.INFO)
logging.info('test') # sends log to stderr
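For the fully explicit variant described above (clearing any previously configured handlers so stderr is the only destination), here is a minimal sketch; the DEBUG level is just an example:
import logging
import sys
root = logging.getLogger()
root.handlers.clear()                               # drop any handlers configured elsewhere
root.addHandler(logging.StreamHandler(sys.stderr))  # stderr is now the only destination
root.setLevel(logging.DEBUG)                        # e.g. run the CLI's logs at DEBUG
root.debug('test')  # appears on stderr only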

Extending the Python Logger

I'm looking for a simple way to extend the logging functionality defined in the standard python library. I just want the ability to choose whether or not my logs are also printed to the screen.
Example: Normally to log a warning you would call:
logging.basicConfig(level=logging.DEBUG, format='%(asctime)s %(levelname)s: %(message)s', filename='log.log', filemode='w')
logging.warning("WARNING!!!")
This sets the configurations of the log and puts the warning into the log
I would like to have something along the lines of a call like:
logging.warning("WARNING!!!", True)
where the True argument signifies whether the log is also printed to stdout.
I've seen some examples of implementations of overriding the logger class
but I am new to the language and don't really follow what is going on, or how to implement this idea. Any help would be greatly appreciated :)
The Python logging module defines these classes:
Loggers that emit log messages.
Handlers that put those messages to a destination.
Formatters that format log messages.
Filters that filter log messages.
A Logger can have Handlers. You add them by invoking the addHandler() method. A Handler can have Filters and Formatters. You similarly add them by invoking the addFilter() and setFormatter() methods, respectively.
It works like this:
import logging
# make a logger
main_logger = logging.getLogger("my logger")
main_logger.setLevel(logging.INFO)
# make some handlers
console_handler = logging.StreamHandler() # by default, sys.stderr
file_handler = logging.FileHandler("my_log_file.txt")
# set logging levels
console_handler.setLevel(logging.WARNING)
file_handler.setLevel(logging.INFO)
# add handlers to logger
main_logger.addHandler(console_handler)
main_logger.addHandler(file_handler)
Now, you can use this object like this:
main_logger.info("logged in the FILE")
main_logger.warning("logged in the FILE and on the CONSOLE")
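The example above skips Formatters and Filters; attaching them to a handler is just two more calls (a minimal sketch, reusing console_handler from above):
formatter = logging.Formatter("%(asctime)s %(levelname)s %(message)s")
console_handler.setFormatter(formatter)
console_handler.addFilter(logging.Filter("my logger"))  # only records from "my logger" (and its children) pass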
If you just run python on your machine, you can type the above code into the interactive console and you should see the output. The log file will get created in your current directory, if you have permission to create files in it.
I hope this helps!
It is possible to subclass the logger class returned by logging.getLoggerClass() to add new functionality to loggers. I wrote a simple class which prints green messages on the console.
The most important parts of my code:
class ColorLogger(logging.getLoggerClass()):
    __GREEN = '\033[0;32m%s\033[0m'
    __FORMAT = {
        'fmt': '%(asctime)s %(levelname)s: %(message)s',
        'datefmt': '%Y-%m-%d %H:%M:%S',
    }
    def __init__(self, format=__FORMAT):
        formatter = logging.Formatter(**format)
        self.root.setLevel(logging.INFO)
        self.root.handlers = []
        (...)
        handler = logging.StreamHandler()
        handler.setFormatter(formatter)
        self.root.addHandler(handler)
    def info(self, message):
        self.root.info(message)
    (...)
    def info_green(self, message):
        self.root.info(self.__GREEN, message)
    (...)
if __name__ == '__main__':
    logger = ColorLogger()
    logger.info("This message has default color.")
    logger.info_green("This message is green.")
Handlers send the log records (created by loggers) to the appropriate destination.
(from the docs: http://docs.python.org/library/logging.html)
Just set up multiple handlers with your logging object, one to write to file, another to write to the screen.
UPDATE
Here is an example function you can call in your classes to get logging set up with a handler.
def set_up_logger(self):
    # create logger object
    self.log = logging.getLogger("command")
    self.log.setLevel(logging.DEBUG)
    # create console handler and set min level recorded to debug messages
    ch = logging.StreamHandler()
    ch.setLevel(logging.DEBUG)
    # add the handler to the log object
    self.log.addHandler(ch)
You would just need to set up another handler for files, like the StreamHandler code that's already there, and add it to the logging object (sketched after the example below). The line ch.setLevel(logging.DEBUG) means that this particular handler will take logging messages that are DEBUG or higher. You'll likely want to set yours to WARNING or higher, since you only want the more important things to go to the console. So, your logging would work like this:
self.log.info("Hello, World!") -> goes to file
self.log.error("OMG!!") -> goes to file AND console
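A minimal sketch of that setup (console raised to WARNING plus a FileHandler at DEBUG; the file name is illustrative), extending set_up_logger above:
def set_up_logger(self):
    self.log = logging.getLogger("command")
    self.log.setLevel(logging.DEBUG)
    # console handler: only WARNING and above reach the screen
    ch = logging.StreamHandler()
    ch.setLevel(logging.WARNING)
    # file handler: everything from DEBUG up goes to the file
    fh = logging.FileHandler("command.log")
    fh.setLevel(logging.DEBUG)
    self.log.addHandler(ch)
    self.log.addHandler(fh)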
