import logging
import sys
log_fmt = 'brbuild: %(message)s'
# Initialize log here
# TODO may need to flush
logging.basicConfig(filename="logtest",
                    level=logging.DEBUG,
                    format=log_fmt,
                    datefmt='%H:%M:%S',
                    filemode='a')
# capture stdout to log
ch = logging.StreamHandler(sys.stdout)
ch.setLevel(logging.DEBUG)
log_fmt = logging.Formatter(log_fmt)
ch.setFormatter(log_fmt)
logging.getLogger("logtest").addHandler(ch)
logging.info("using logging")
print "using stdout"
The logtest file then contains:
brbuild: using logging
How can I get "using stdout" written to the log as well?
That's a bit of a hack, but you could redefine print in the current module; other modules could then do a from foo import print to use it.
For simplicity's sake I haven't used file handles in this example, only stdout/stderr. If you use files, you can still add a sys.stdout.write(msg + os.linesep) statement to your new print function (a sketch of that variant follows the example output below).
My new print may not be as powerful as the original, but it supports multiple arguments.
import logging, sys

def print(*args):
    logger.info(" ".join([str(x) for x in args]))

if __name__ == '__main__':
    logger = logging.getLogger('foo')
    logger.addHandler(logging.StreamHandler(sys.stdout))
    logger.addHandler(logging.StreamHandler(sys.stderr))
    logger.setLevel(logging.INFO)
    a = 12
    logger.info('1. This should appear in both stdout and stderr.')
    print("logging works!", a)
(You have to call it with parentheses.) Result:
1. This should appear in both stdout and stderr.
1. This should appear in both stdout and stderr.
logging works! 12
logging works! 12
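A rough sketch of the file-based variant mentioned above, which logs to a file while still echoing to the real stdout; the file name build.log and the logger name foo are only placeholders:
import logging
import os
import sys

logger = logging.getLogger('foo')
logger.addHandler(logging.FileHandler('build.log'))  # placeholder log file
logger.setLevel(logging.INFO)

_stdout_write = sys.stdout.write  # keep a handle on the real stdout

def print(*args):
    msg = " ".join(str(x) for x in args)
    logger.info(msg)                 # goes to the file handler
    _stdout_write(msg + os.linesep)  # still echoes to the terminal

print("goes to both the file and the terminal")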
If your intention is to redirect the print output (i.e. redirect sys.stdout) to a logger, or to both the logger and standard output, you will need to create a class that mimics a file-like object and assign an instance of that class to sys.stdout.
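A minimal sketch of that idea; the LoggerWriter name is only illustrative, and the more complete StreamToLogger class further down also handles line buffering:
import logging
import sys

class LoggerWriter:
    """Minimal file-like object that forwards writes to a logger."""
    def __init__(self, logger, level=logging.INFO):
        self.logger = logger
        self.level = level

    def write(self, message):
        # print() calls write() with the text and again with '\n'; skip blanks
        if message.strip():
            self.logger.log(self.level, message.rstrip())

    def flush(self):
        pass  # nothing is buffered in this minimal version

logging.basicConfig(filename="logtest", format='brbuild: %(message)s', level=logging.DEBUG)
sys.stdout = LoggerWriter(logging.getLogger("logtest"))
print("using stdout")  # now lands in the log file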
Related
Using Python 3 logging, how can I specify that logging.debug() and logging.info() should go to stdout and logging.warning() and logging.error() go to stderr?
You can create separate loggers for each stream like this:
import logging
import sys
logging.basicConfig(format="%(levelname)s %(message)s")
stdout_logger = logging.Logger(name="stdout_logger", level=logging.DEBUG)
stderr_logger = logging.Logger(name="stderr_logger", level=logging.DEBUG)
stdout_handler = logging.StreamHandler(stream=sys.stdout)
stderr_handler = logging.StreamHandler(stream=sys.stderr)
stdout_logger.addHandler(hdlr=stdout_handler)
stderr_logger.addHandler(hdlr=stderr_handler)
stdout_logger.info("this will output to stdout")
stderr_logger.info("this will output to stderr")
Then if you want to log something on 'debug' or 'info' level you can just use stdout_logger. For 'warning' and 'error' level messages use stderr_logger.
How to do this kind of thing is documented in the logging cookbook, and though the example levels/destinations in the cookbook recipe are slightly different from those in your question, there should be enough information there for you to arrive at a solution. The key thing is using a filter function and attaching it to a handler.
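For reference, a small sketch of the cookbook approach, using a plain callable as the filter attached to the stdout handler, with the levels adjusted to match this question:
import logging
import sys

def allow_stdout(record):
    # Keep DEBUG/INFO on stdout; the stderr handler takes WARNING and above.
    return record.levelno <= logging.INFO

stdout_handler = logging.StreamHandler(sys.stdout)
stdout_handler.addFilter(allow_stdout)

stderr_handler = logging.StreamHandler(sys.stderr)
stderr_handler.setLevel(logging.WARNING)

logging.basicConfig(level=logging.DEBUG, handlers=[stdout_handler, stderr_handler])
logging.info("goes to stdout")
logging.warning("goes to stderr")
(Passing a plain function to addFilter requires Python 3.2+, and the handlers argument to basicConfig requires Python 3.3+.)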
I am using pytest to test a CLI that produces some output. While running the test, I want to set my CLI's log level to DEBUG. However, I don't want CLI logs to interfere with tests that are parsing the output of the CLI.
How can I make logging module send all the logs to only stderr? I looked at this post but it talks about sending logs to stderr in addition to stdout.
You can direct the log output for a given logger to stderr as follows. This defaults to stderr for output, but you can use sys.stdout instead if you prefer.
import logging
import sys
DEFAULT_LOGGER_NAME = 'default_logger'
def init_logging(logger_name=DEFAULT_LOGGER_NAME,
                 log_level=logging.DEBUG,
                 stream=None):
    # logging
    logger = logging.getLogger(logger_name)
    logger.setLevel(log_level)

    # create formatter
    formatter = logging.Formatter('%(asctime)s - %(name)s - %(levelname)s - %(message)s')

    # create console handler and set level to debug
    if stream is None:
        stream = sys.stderr
    ch = logging.StreamHandler(stream=stream)
    ch.setLevel(log_level)

    # add formatter to ch
    ch.setFormatter(formatter)

    # add ch to logger
    logger.addHandler(ch)
    return logger
This function needs to be called at the beginning of your program (e.g. beginning of main()).
Then within the code, you just need to call the following:
logger = logging.getLogger(DEFAULT_LOGGER_NAME)  # or whatever name you passed to init_logging
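A small usage sketch, assuming the init_logging function and DEFAULT_LOGGER_NAME constant defined above:
import logging

def main():
    # configure once at program start; stream defaults to sys.stderr
    init_logging(log_level=logging.DEBUG)

    # anywhere later in the code, fetch the same logger by name
    logger = logging.getLogger(DEFAULT_LOGGER_NAME)
    logger.debug("this goes to stderr only")

if __name__ == '__main__':
    main()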
Do the same that the linked answer does but replace stdout with stderr. So you would create a handler with logging.StreamHandler(sys.stderr) and make sure this is the only active handler if you want to exclusively have logs go to stderr.
As @Tomerikoo correctly points out, you don't actually need to do anything, though, since logging defaults to a StreamHandler that writes to stderr. The only real value of the code below is that it sets a different level than the default. Just logging.warning('log') with no other setup will send a log to stderr.
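An explicit-handler version of what is described above (a sketch; the logger name cli is arbitrary):
import logging
import sys

logger = logging.getLogger("cli")   # arbitrary name, for illustration only
logger.setLevel(logging.DEBUG)
logger.propagate = False            # keep records away from any root handlers

handler = logging.StreamHandler(sys.stderr)
logger.addHandler(handler)          # the only handler, so everything goes to stderr

logger.debug("this appears on stderr only")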
Addendum: you can also achieve this using basicConfig to have less boilerplate code.
import logging
import sys
logging.basicConfig(stream=sys.stderr, level=logging.INFO)
logging.info('test') # sends log to stderr
So let's say I've got a piece of code that looks like this:
main.py
def get_stdout():
    sys.stdout = open(str(os.getpid()) + ".out", "w")
    foo.foo()

p = Process(target=get_stdout)
p.start()
foo.py
def foo():
    my_logger.info('LOG INFO HERE')

my_logger = logging.getLogger()
my_logger.setLevel(logging.DEBUG)
logHandler = logging.StreamHandler()
logHandler.setFormatter(logging.Formatter('LOG: - %(asctime)s - %(name)s - %(levelname)s - %(message)s'))
my_logger.addHandler(logHandler)
The logger is defined at the bottom of the foo module. When I call python main.py, the intention is to spawn a subprocess that calls foo() from the foo module, capture its log output, and write it to a file. This example doesn't work because the output stream of the logger object is fixed when the module is first initialized, so the output just gets written to the terminal and not to the file.
What's the best way to get around this? Right now, each module has only a single instance of the logger class and I'm sure there's a better way to do this, but I'm drawing a blank on being able to use the logging module and still be able to isolate loglines from separate processes.
By default, log messages are written to sys.stderr, not sys.stdout, so you need to redirect stderr instead:
def get_stdout():  # Maybe rename this
    sys.stderr = open(str(os.getpid()) + ".out", "w")
    foo.foo()
Is it possible to have Python logging messages that are INFO or DEBUG go to stdout and WARNING or greater go to stderr?
This seems to do what I want:
#!/usr/bin/python
import sys
import logging

class InfoFilter(logging.Filter):
    def filter(self, rec):
        return rec.levelno in (logging.DEBUG, logging.INFO)

logger = logging.getLogger(__name__)
logger.setLevel(logging.DEBUG)

h1 = logging.StreamHandler(sys.stdout)
h1.setLevel(logging.DEBUG)
h1.addFilter(InfoFilter())

h2 = logging.StreamHandler()
h2.setLevel(logging.WARNING)

logger.addHandler(h1)
logger.addHandler(h2)
I thought this would help: Handler.setLevel(lvl) sets the threshold for this handler to lvl; logging messages which are less severe than lvl will be ignored, and when a handler is created the level is set to NOTSET (which causes all messages to be processed).
But now I see that it wouldn't do what you want (split INFO/DEBUG from WARNING/ERROR).
That being said, you could write a custom handler (a class extending logging.StreamHandler, for example) and override the Handler.handle() method, or emit(), which is where StreamHandler actually writes; see the sketch below.
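A rough sketch of such a handler; here emit() is overridden and the target stream is chosen per record, with the class name being only illustrative:
import logging
import sys

class SplitStreamHandler(logging.StreamHandler):
    """Send WARNING and above to stderr, everything else to stdout."""
    def emit(self, record):
        # StreamHandler.emit() writes to self.stream, so swap it per record
        self.stream = sys.stderr if record.levelno >= logging.WARNING else sys.stdout
        super().emit(record)

logger = logging.getLogger(__name__)
logger.setLevel(logging.DEBUG)
logger.addHandler(SplitStreamHandler())

logger.info("this goes to stdout")
logger.error("this goes to stderr")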
I have a Python script that makes use of 'Print' for printing to stdout. I've recently added logging via Python Logger and would like to make it so these print statements go to logger if logging is enabled. I do not want to modify or remove these print statements.
I can log by doing 'log.info("some info msg")'. I want to be able to do something like this:
if logging_enabled:
    sys.stdout = log.info

print("test")
If logging is enabled, "test" should be logged as if I did log.info("test"). If logging isn't enabled, "test" should just be printed to the screen.
Is this possible? I know I can direct stdout to a file in a similar manner (see: redirect prints to log file)
You have two options:
Open a logfile and replace sys.stdout with it, not a function:
log = open("myprog.log", "a")
sys.stdout = log
>>> print("Hello")
>>> # nothing is printed because it goes to the log file instead.
Replace print with your log function (here log is assumed to be a logging.Logger, not the file object from the first option):
# If you're using python 2.x, uncomment the next line
#from __future__ import print_function
print = log.info
>>> print("Hello!")
>>> # nothing is printed because log.info is called instead of print
Of course, you can both print to the standard output and append to a log file, like this:
# Uncomment the line below for python 2.x
#from __future__ import print_function
import logging
logging.basicConfig(level=logging.INFO, format='%(message)s')
logger = logging.getLogger()
logger.addHandler(logging.FileHandler('test.log', 'a'))
print = logger.info
print('yo!')
One more method is to wrap the logger in an object that translates calls to write to the logger's log method.
Ferry Boender does just this, provided under the GPL license in a post on his website. The code below is based on this but solves two issues with the original:
1. The class doesn't implement the flush method, which is called when the program exits.
2. The class doesn't buffer writes on newlines the way io.TextIOWrapper objects are supposed to, which results in newlines appearing at odd points.
import logging
import sys

class StreamToLogger(object):
    """
    Fake file-like stream object that redirects writes to a logger instance.
    """
    def __init__(self, logger, log_level=logging.INFO):
        self.logger = logger
        self.log_level = log_level
        self.linebuf = ''

    def write(self, buf):
        temp_linebuf = self.linebuf + buf
        self.linebuf = ''
        for line in temp_linebuf.splitlines(True):
            # From the io.TextIOWrapper docs:
            #   On output, if newline is None, any '\n' characters written
            #   are translated to the system default line separator.
            # By default sys.stdout.write() expects '\n' newlines and then
            # translates them so this is still cross platform.
            if line[-1] == '\n':
                self.logger.log(self.log_level, line.rstrip())
            else:
                self.linebuf += line

    def flush(self):
        if self.linebuf != '':
            self.logger.log(self.log_level, self.linebuf.rstrip())
        self.linebuf = ''

logging.basicConfig(
    level=logging.DEBUG,
    format='%(asctime)s:%(levelname)s:%(name)s:%(message)s',
    filename="out.log",
    filemode='a'
)

stdout_logger = logging.getLogger('STDOUT')
sl = StreamToLogger(stdout_logger, logging.INFO)
sys.stdout = sl

stderr_logger = logging.getLogger('STDERR')
sl = StreamToLogger(stderr_logger, logging.ERROR)
sys.stderr = sl
This allows you to easily route all output to a logger of your choice. If needed, you can save sys.stdout and/or sys.stderr before replacing them, as others in this thread have mentioned, so that you can restore them later.
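For instance, a small sketch of that save/restore step:
import sys

original_stdout, original_stderr = sys.stdout, sys.stderr  # keep references

# ... run with the StreamToLogger redirection in place ...

sys.stdout, sys.stderr = original_stdout, original_stderr  # restore when done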
A much simpler option,
import logging, sys
logging.basicConfig(filename='path/to/logfile', level=logging.DEBUG)
logger = logging.getLogger()
sys.stderr.write = logger.error
sys.stdout.write = logger.info
Once you've defined your logger, use this to redirect print to the logger, even with multiple print arguments:
print = lambda *tup : logger.info(str(" ".join([str(x) for x in tup])))
You really should do this the other way around: adjust your logging configuration to use print statements or something else, depending on the settings. Do not override print behaviour, as some settings that may be introduced in the future (e.g. by you or by someone else using your module) may actually send output to stdout, and then you will have problems.
There is a handler that is supposed to redirect your log messages to the proper stream (a file, stdout, or anything else file-like). It is called StreamHandler and it is bundled with the logging module.
So basically in my opinion you should do, what you stated you don't want to do: replace print statements with actual logging.
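For example, a minimal sketch of what that might look like, with a configuration flag deciding where the StreamHandler writes (the flag name log_to_stdout is only illustrative):
import logging
import sys

log_to_stdout = True  # illustrative setting; could come from config or CLI flags

handler = logging.StreamHandler(sys.stdout if log_to_stdout else sys.stderr)
logging.basicConfig(level=logging.INFO, handlers=[handler], format='%(message)s')

logging.info("some info msg")  # instead of print("some info msg")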
The snippet below works perfectly inside my PySpark code; posting it in case it helps someone:
import os
import sys
import logging
import logging.handlers
log = logging.getLogger(__name__)
handler = logging.FileHandler("spam.log")
formatter = logging.Formatter("%(asctime)s - %(name)s - %(levelname)s - %(message)s")
handler.setFormatter(formatter)
log.addHandler(handler)
sys.stderr.write = log.error
sys.stdout.write = log.info
Every error will be logged to spam.log in the same directory, and nothing will appear on the console/stdout. The same goes for every info message. To print errors/info to the console as well as to the file, remove the two sys.*.write lines above.
Happy Coding
Cheers!!!