I've set up a logger using the following code:
def setup_logging():
    import logging
    import logging.handlers
    import os
    #from time import gmtime, strftime
    #import logging.handlers
    logger = logging.getLogger('apt')
    logger.setLevel(logging.DEBUG)
    # create file handler
    fh = logging.handlers.RotatingFileHandler(os.path.join('..','logs','apt.log'), maxBytes=1000000, backupCount=5)
    fh.setLevel(logging.DEBUG)
    # create console handler
    ch = logging.StreamHandler()
    ch.setLevel(logging.DEBUG)
    # create formatter and add it to the handlers
    formatter = logging.Formatter('%(asctime)s - %(name)s - %(levelname)s - %(message)s')
    ch.setFormatter(formatter)
    fh.setFormatter(formatter)
    # add the handlers to logger
    logger.addHandler(ch)
    logger.addHandler(fh)
This prints the log messages to both a file and the console (which is what I was after). The only issue is that the console messages are red. This is distracting, since red makes everything look like an error (when it's just info). How can I change it so that the console messages are a different color?
Ideally, black for debug and info, red for warning and above.
I'm using Eclipse and PyDev.
The PyDev console highlights messages written to stderr in red by default, and Python's logging StreamHandler writes to stderr unless you tell it otherwise. If you wish to change this behavior, see this post: Logging, StreamHandler and standard streams
To change the colors in PyDev, see here: http://pydev.org/manual_adv_interactive_console.html
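A minimal sketch of that idea (not PyDev-specific), keeping your existing file handler as-is: route DEBUG and INFO records to a stdout handler, which the PyDev console prints in the normal colour, and WARNING and above to a stderr handler, which it highlights in red. The filter below is illustrative and not part of the original setup_logging; callable filters require Python 3.2+.

import logging
import sys

logger = logging.getLogger('apt')
logger.setLevel(logging.DEBUG)

# stdout handler for DEBUG and INFO (printed in the normal console colour)
stdout_handler = logging.StreamHandler(sys.stdout)
stdout_handler.setLevel(logging.DEBUG)
stdout_handler.addFilter(lambda record: record.levelno < logging.WARNING)

# stderr handler for WARNING and above (highlighted in red by PyDev)
stderr_handler = logging.StreamHandler(sys.stderr)
stderr_handler.setLevel(logging.WARNING)

formatter = logging.Formatter('%(asctime)s - %(name)s - %(levelname)s - %(message)s')
stdout_handler.setFormatter(formatter)
stderr_handler.setFormatter(formatter)

logger.addHandler(stdout_handler)
logger.addHandler(stderr_handler)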
My logging setup is:
import coloredlogs
import logging
import sys
# Create a logger object.
# logger = logging.getLogger(__name__)
# By default the install() function installs a handler on the root logger,
# this means that log messages from your code and log messages from the
# libraries that you use will all show up on the terminal.
coloredlogs.install(level='DEBUG')
logging.basicConfig(
    format='%(asctime)s %(levelname)-8s %(message)s',
    level=logging.INFO,
    stream=sys.stdout,
    datefmt='%Y-%m-%d %H:%M:%S')
If I configure the console to use Python for output, all lines start at column 0, but all output is red. If I specify 'use terminal', the colors are there, but the lines don't start at column 0: each line starts at the column where the previous line ended.
How can I get all log messages starting at column 0 AND in color?
Try adding isatty to your call to install. This overrides whatever the auto-detection is doing to determine what type of terminal is being used.
import coloredlogs
import logging
import sys
logger = logging.getLogger(__name__)
coloredlogs.install(level=logging.DEBUG, logger=logger, isatty=True,
                    fmt="%(asctime)s %(levelname)-8s %(message)s",
                    stream=sys.stdout,
                    datefmt='%Y-%m-%d %H:%M:%S')
logger.debug("this is a debugging message")
logger.info("this is an informational message")
logger.warning("this is a warning message")
logger.error("this is an error message")
logger.critical("this is a critical message")
explanation:
If I understood the problem correctly, you are seeing the logging handler default to sys.stderr instead of sys.stdout (which is why the text appears in red).
There are likely two issues going on.
coloredlogs.install needs to be told what type of rules you want in the handler, not logging.basicConfig.
The automatic terminal detection would fail unless you either force the formatter to think it's a terminal (isatty=True) or set PyCharm to emulate a terminal. The red text itself is fixed by passing in stream=sys.stdout.
I would like to save all logging lines to superwrapper.log but only show INFO in the console.
If I remove the # from the filename line, the file is written correctly, but then I don't see anything in the console.
if __name__ == '__main__':
    logging.basicConfig(
        #filename='superwrapper.log',
        level=logging.DEBUG,
        format='%(asctime)s.%(msecs)03d %(levelname)s %(module)s - %(funcName)s: %(message)s',
        datefmt='%Y-%m-%d %H:%M:%S'
    )
2020-04-28 11:41:09.698 INFO common - handle_elasticsearch: Elastic connection detected
2020-04-28 11:41:09.699 INFO superwrapper - <module>: Cookies Loaded: |TRUE|
2020-04-28 11:41:09.715 DEBUG connectionpool - _new_conn: Starting new HTTPS connection (1): m.facebook.com:443
You can use multiple handlers. logging.basicConfig can accept handlers as an argument starting in Python 3.3. One is required for logging to the log file and one to the console. You can also set the handlers to have different logging levels. The simplest way I can think of is to do this:
import logging
import sys
file_handler = logging.FileHandler('superwrapper.log')
console_handler = logging.StreamHandler(sys.stdout)
console_handler.setLevel(logging.INFO)
logging.basicConfig(
    level=logging.DEBUG,
    format='%(asctime)s.%(msecs)03d %(levelname)s %(module)s - %(funcName)s: %(message)s',
    datefmt='%Y-%m-%d %H:%M:%S',
    handlers=[
        file_handler,
        console_handler
    ]
)
One thing to note is that StreamHandler writes to stderr by default. Usually you will want to override this with sys.stdout, as done above.
You can set up multiple handlers. This will keep the DEBUG messages out of the console, but note that messages of a higher severity will still be broadcast there (e.g. 'WARNING' and 'ERROR').
This exact scenario is in the logging cookbook of the Python docs:
import logging
# set up logging to file - see previous section for more details
logging.basicConfig(level=logging.DEBUG,
                    format='%(asctime)s %(name)-12s %(levelname)-8s %(message)s',
                    datefmt='%m-%d %H:%M',
                    filename='/temp/myapp.log',
                    filemode='w')
# define a Handler which writes INFO messages or higher to the sys.stderr
console = logging.StreamHandler()
console.setLevel(logging.INFO)
# set a format which is simpler for console use
formatter = logging.Formatter('%(name)-12s: %(levelname)-8s %(message)s')
# tell the handler to use this format
console.setFormatter(formatter)
# add the handler to the root logger
logging.getLogger('').addHandler(console)
# Now, we can log to the root logger, or any other logger. First the root...
logging.info('Jackdaws love my big sphinx of quartz.')
# Now, define a couple of other loggers which might represent areas in your
# application:
logger1 = logging.getLogger('myapp.area1')
logger2 = logging.getLogger('myapp.area2')
logger1.debug('Quick zephyrs blow, vexing daft Jim.')
logger1.info('How quickly daft jumping zebras vex.')
logger2.warning('Jail zesty vixen who grabbed pay from quack.')
logger2.error('The five boxing wizards jump quickly.')
In the example given by the cookbook, you should see in the console all the 'INFO', 'WARNING' and 'ERROR' messages, but only the log file will hold the 'DEBUG' message.
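With that configuration, the console output should look roughly like this (the DEBUG line goes only to the log file):

root        : INFO     Jackdaws love my big sphinx of quartz.
myapp.area1 : INFO     How quickly daft jumping zebras vex.
myapp.area2 : WARNING  Jail zesty vixen who grabbed pay from quack.
myapp.area2 : ERROR    The five boxing wizards jump quickly.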
@Alan's answer was great, but it also wrote the messages from the root logger, which was too much for me because I was afraid the size of the log file would get out of control. So I modified it as below to log only what I specified and nothing extra from the imported modules.
import logging
# set up logging to file - see previous section for more details
logging.basicConfig(format='%(asctime)s %(name)-12s %(levelname)-8s %(message)s',
                    datefmt='%m-%d %H:%M',
                    filename='/temp/myapp.log',
                    filemode='w')
logger = logging.getLogger('myapp')
logger.setLevel(logging.DEBUG)
# define a Handler which writes INFO messages or higher to the sys.stderr
console = logging.StreamHandler()
console.setLevel(logging.INFO)
# set a format which is simpler for console use
formatter = logging.Formatter('%(name)-12s: %(levelname)-8s %(message)s')
# tell the handler to use this format
console.setFormatter(formatter)
# add the handler to your logger
logger.addHandler(console)
Then in my app, I logged as below, just using logger (without the numeric suffixes).
logger.debug('Quick zephyrs blow, vexing daft Jim.')
logger.info('How quickly daft jumping zebras vex.')
logger.warning('Jail zesty vixen who grabbed pay from quack.')
logger.error('The five boxing wizards jump quickly.')
I wrote a Python script which executes a while loop and requires a keyboard interrupt or system shutdown to terminate.
I would like my log file to save the log output; currently the log file gets created, but nothing gets written to it.
The following creates an output file with the contents I expect:
import logging
logger = logging.getLogger(__name__)
logger.setLevel(logging.INFO)
# create a file handler
handler = logging.FileHandler('hello.log')
# handler.setLevel(logging.INFO)
# create a logging format
formatter = logging.Formatter('%(asctime)s - %(name)s - %(levelname)s - %(message)s')
handler.setFormatter(formatter)
# add the handlers to the logger
logger.addHandler(handler)
logger.info('Mmmm...donuts')
But when I integrate it into my code, the log file lacks any contents:
from logging import log, FileHandler, getLogger, Formatter, CRITICAL
logger = getLogger(__name__)
logger.setLevel(CRITICAL)
handler = FileHandler("test.log")
formatter = Formatter('%(asctime)s - %(name)s - %(levelname)s - %(message)s')
handler.setFormatter(formatter)
logger.addHandler(formatter)
logger.info("start")
enter_while_loop()
I believe I should handle this using atexit, but how?
Thank you for your time and consideration.
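For reference, a sketch of what seems most likely to be going wrong in that snippet, independent of atexit: logger.addHandler(formatter) passes the formatter where a handler is expected (it should be logger.addHandler(handler)), and the logger level is set to CRITICAL, so logger.info("start") is filtered out before it reaches the file. With those two lines adjusted, the record should be written:

from logging import FileHandler, getLogger, Formatter, INFO

logger = getLogger(__name__)
logger.setLevel(INFO)       # INFO instead of CRITICAL, so info() records are not filtered out
handler = FileHandler("test.log")
formatter = Formatter('%(asctime)s - %(name)s - %(levelname)s - %(message)s')
handler.setFormatter(formatter)
logger.addHandler(handler)  # add the handler, not the formatter
logger.info("start")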
I have created a global logger using the following:
def logini():
    logfile = '/var/log/cs_status.log'
    import logging
    import logging.handlers
    global logger
    logger = logging.getLogger()
    logging.basicConfig(filename=logfile, filemode='a', format='%(asctime)s %(name)s %(levelname)s %(message)s', datefmt='%y%m%d-%H:%M:%S', level=logging.DEBUG, propagate=0)
    handler = logging.handlers.RotatingFileHandler(logfile, maxBytes=2000000, backupCount=5)
    logger.addHandler(handler)
    __builtins__.logger = logger
It works; however, I am getting two outputs for every log entry, one with the formatting and one without.
I realize that this is being caused by the file rotator, because if I comment out the two lines of handler code I get a single, correctly formatted log entry.
How can I prevent the log rotator from outputting a second entry?
Currently you're configuring two file handlers that point to the same logfile. To use only the RotatingFileHandler, get rid of the basicConfig call:
logger = logging.getLogger()
logger.setLevel(logging.DEBUG)  # basicConfig used to set this; without it the root logger stays at WARNING
handler = logging.handlers.RotatingFileHandler(logfile, maxBytes=2000000,
                                               backupCount=5)
formatter = logging.Formatter(fmt='%(asctime)s %(name)s %(levelname)s %(message)s',
                              datefmt='%y%m%d-%H:%M:%S')
handler.setFormatter(formatter)
handler.setLevel(logging.DEBUG)
logger.addHandler(handler)
All basicConfig does for you is to provide an easy way to instantiate either a StreamHandler (default) or a FileHandler and set its loglevel and formats (see the docs for more information). If you need a handler other than these two, you should instantiate and configure it yourself.
When I run the following inside an IPython Notebook, I don't see any output:
import logging
logging.basicConfig(level=logging.DEBUG)
logging.debug("test")
Anyone know how to make it so I can see the "test" message inside the notebook?
Try the following:
import logging
logger = logging.getLogger()
logger.setLevel(logging.DEBUG)
logging.debug("test")
According to logging.basicConfig:
Does basic configuration for the logging system by creating a
StreamHandler with a default Formatter and adding it to the root
logger. The functions debug(), info(), warning(), error() and
critical() will call basicConfig() automatically if no handlers are
defined for the root logger.
This function does nothing if the root logger already has handlers
configured for it.
It seems like IPython Notebook calls basicConfig (or sets a handler) somewhere.
If you still want to use basicConfig, reload the logging module like this:
from importlib import reload # Not needed in Python 2
import logging
reload(logging)
logging.basicConfig(format='%(asctime)s %(levelname)s:%(message)s', level=logging.DEBUG, datefmt='%I:%M:%S')
My understanding is that the IPython session starts up logging, so basicConfig doesn't work. Here is the setup that works for me (I wish it weren't so gross looking, since I want to use it for almost all my notebooks):
import logging
logger = logging.getLogger()
fhandler = logging.FileHandler(filename='mylog.log', mode='a')
formatter = logging.Formatter('%(asctime)s - %(name)s - %(levelname)s - %(message)s')
fhandler.setFormatter(formatter)
logger.addHandler(fhandler)
logger.setLevel(logging.DEBUG)
Now when I run:
logging.error('hello!')
logging.debug('This is a debug message')
logging.info('this is an info message')
logging.warning('tbllalfhldfhd, warning.')
I get a "mylog.log" file in the same directory as my notebook that contains:
2015-01-28 09:49:25,026 - root - ERROR - hello!
2015-01-28 09:49:25,028 - root - DEBUG - This is a debug message
2015-01-28 09:49:25,029 - root - INFO - this is an info message
2015-01-28 09:49:25,032 - root - WARNING - tbllalfhldfhd, warning.
Note that if you rerun this without restarting the IPython session, it will write duplicate entries to the file, since there would now be two file handlers defined.
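One way to guard against that (a sketch, not part of the original setup) is to remove any handlers already attached to the root logger before adding yours:

import logging

logger = logging.getLogger()

# Remove handlers left over from a previous run of this cell
for handler in list(logger.handlers):
    logger.removeHandler(handler)

fhandler = logging.FileHandler(filename='mylog.log', mode='a')
fhandler.setFormatter(logging.Formatter('%(asctime)s - %(name)s - %(levelname)s - %(message)s'))
logger.addHandler(fhandler)
logger.setLevel(logging.DEBUG)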
Bear in mind that stderr is the default stream for the logging module, so in IPython and Jupyter notebooks you might not see anything unless you configure the stream to stdout:
import logging
import sys
logging.basicConfig(format='%(asctime)s | %(levelname)s : %(message)s',
                    level=logging.INFO, stream=sys.stdout)
logging.info('Hello world!')
What worked for me now (Jupyter notebook server 5.4.1, IPython 7.0.1):
import logging
logging.basicConfig()
logger = logging.getLogger('Something')
logger.setLevel(logging.DEBUG)
Now I can use logger to print info; otherwise I would see only messages at the default level (logging.WARNING) or above.
You can configure logging by running %config Application.log_level="INFO"
For more information, see IPython kernel options
I wanted a simple and straightforward answer to this, with nicely styled output, so here's my recommendation:
import sys
import logging
logging.basicConfig(
    format='%(asctime)s [%(levelname)s] %(name)s - %(message)s',
    level=logging.INFO,
    datefmt='%Y-%m-%d %H:%M:%S',
    stream=sys.stdout,
)
log = logging.getLogger('notebook')
Then you can use log.info() or any of the other logging levels anywhere in your notebook, with output that looks like this:
2020-10-28 17:07:08 [INFO] notebook - Hello world
2020-10-28 17:12:22 [INFO] notebook - More info here
2020-10-28 17:12:22 [INFO] notebook - And some more
As of Python 3.8, a force parameter has been added that removes any existing handlers, which allows basicConfig to work. This worked on IPython 7.29.0 and JupyterLab 3.2.1.
import logging
logging.basicConfig(level=logging.DEBUG,
                    force=True)
logging.debug("test")
I set up a logger for a file and also wanted the messages to show up in the notebook. It turns out that adding a file handler clears out the default stream handler.
logger = logging.getLogger()
formatter = logging.Formatter('%(asctime)s - %(name)s - %(levelname)s - %(message)s')
# Setup file handler
fhandler = logging.FileHandler('my.log')
fhandler.setLevel(logging.DEBUG)
fhandler.setFormatter(formatter)
# Configure stream handler for the cells
chandler = logging.StreamHandler()
chandler.setLevel(logging.DEBUG)
chandler.setFormatter(formatter)
# Add both handlers
logger.addHandler(fhandler)
logger.addHandler(chandler)
logger.setLevel(logging.DEBUG)
# Show the handlers
logger.handlers
# Log Something
logger.info("Test info")
logger.debug("Test debug")
logger.error("Test error")
setup
import logging
# make a handler
handler = logging.StreamHandler()
formatter = logging.Formatter('%(asctime)s - %(name)s - %(levelname)s - %(message)s')
handler.setFormatter(formatter)
# add it to the root logger
logging.getLogger().addHandler(handler)
log from your own logger
# make a logger for this notebook, set verbosity
logger = logging.getLogger(__name__)
logger.setLevel('DEBUG')
# send messages
logger.debug("debug message")
logger.info("so much info")
logger.warning("you've veen warned!")
logger.error("bad news")
logger.critical("really bad news")
2021-09-02 18:18:27,397 - __main__ - DEBUG - debug message
2021-09-02 18:18:27,397 - __main__ - INFO - so much info
2021-09-02 18:18:27,398 - __main__ - WARNING - you've veen warned!
2021-09-02 18:18:27,398 - __main__ - ERROR - bad news
2021-09-02 18:18:27,399 - __main__ - CRITICAL - really bad news
capture logging from other libraries
logging.getLogger('google').setLevel('DEBUG')
from google.cloud import storage
client = storage.Client()
2021-09-02 18:18:27,415 - google.auth._default - DEBUG - Checking None for explicit credentials as part of auth process...
2021-09-02 18:18:27,416 - google.auth._default - DEBUG - Checking Cloud SDK credentials as part of auth process...
2021-09-02 18:18:27,416 - google.auth._default - DEBUG - Cloud SDK credentials not found on disk; not using them
...