Python logging module on Eclipse

Where should I see the logging output in Eclipse while debugging? And when running?

It depends on how you configure your logging system. If you use only print statements, the output is shown in the Eclipse Console view.
If you use logging and you configured a console handler (StreamHandler), it is also displayed in the Eclipse Console view.
If you configured only a file handler in the logging configuration, you'll have to tail the log files ;)

Here is an example that worked for me:
import logging

# LOG
log = logging.getLogger("appname")
log.setLevel(logging.DEBUG)
ch = logging.StreamHandler()
ch.setLevel(logging.DEBUG)
fmt = logging.Formatter('%(asctime)s - %(name)s - %(levelname)s - %(message)s')
ch.setFormatter(fmt)
log.addHandler(ch)
log.debug("something happened")  # Will print on the console: '2013-01-26 12:58:28,974 - appname - DEBUG - something happened'

Related

Save logging DEBUG and show only logging INFO Python

I would like to save all logging lines to superwrapper.log but only show the INFO ones in the console.
If I uncomment the filename line, the file is written correctly but I don't see anything in the console.
if __name__ == '__main__':
    logging.basicConfig(
        #filename='superwrapper.log',
        level=logging.DEBUG,
        format='%(asctime)s.%(msecs)03d %(levelname)s %(module)s - %(funcName)s: %(message)s',
        datefmt='%Y-%m-%d %H:%M:%S'
    )
2020-04-28 11:41:09.698 INFO common - handle_elasticsearch: Elastic connection detected
2020-04-28 11:41:09.699 INFO superwrapper - <module>: Cookies Loaded: |TRUE|
2020-04-28 11:41:09.715 DEBUG connectionpool - _new_conn: Starting new HTTPS connection (1): m.facebook.com:443
You can use multiple handlers. logging.basicConfig can accept handlers as an argument starting in Python 3.3. You need one handler for the log file and one for the console, and you can give them different logging levels. The simplest way I can think of is to do this:
import logging
import sys

file_handler = logging.FileHandler('superwrapper.log')
console_handler = logging.StreamHandler(sys.stdout)
console_handler.setLevel(logging.INFO)

logging.basicConfig(
    level=logging.DEBUG,
    format='%(asctime)s.%(msecs)03d %(levelname)s %(module)s - %(funcName)s: %(message)s',
    datefmt='%Y-%m-%d %H:%M:%S',
    handlers=[
        file_handler,
        console_handler
    ]
)
One thing to note is that StreamHandler writes to stderr by default. Usually you will want to override this with sys.stdout, as above.
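With the configuration above in place, a quick illustrative check (not part of the original answer) of where each level ends up:
logging.debug('saved to superwrapper.log only')  # below the console handler's INFO threshold
logging.info('saved to superwrapper.log and shown in the console')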
You can set up multiple handlers and loggers. This keeps the DEBUG messages out of the console, but note that messages of higher severity (e.g. 'WARNING' and 'ERROR') will still be shown there.
This exact scenario is in the logging cookbook of the Python docs:
import logging

# set up logging to file - see previous section for more details
logging.basicConfig(level=logging.DEBUG,
                    format='%(asctime)s %(name)-12s %(levelname)-8s %(message)s',
                    datefmt='%m-%d %H:%M',
                    filename='/temp/myapp.log',
                    filemode='w')
# define a Handler which writes INFO messages or higher to the sys.stderr
console = logging.StreamHandler()
console.setLevel(logging.INFO)
# set a format which is simpler for console use
formatter = logging.Formatter('%(name)-12s: %(levelname)-8s %(message)s')
# tell the handler to use this format
console.setFormatter(formatter)
# add the handler to the root logger
logging.getLogger('').addHandler(console)
# Now, we can log to the root logger, or any other logger. First the root...
logging.info('Jackdaws love my big sphinx of quartz.')
# Now, define a couple of other loggers which might represent areas in your
# application:
logger1 = logging.getLogger('myapp.area1')
logger2 = logging.getLogger('myapp.area2')
logger1.debug('Quick zephyrs blow, vexing daft Jim.')
logger1.info('How quickly daft jumping zebras vex.')
logger2.warning('Jail zesty vixen who grabbed pay from quack.')
logger2.error('The five boxing wizards jump quickly.')
In the cookbook example, you should see all the 'INFO', 'WARNING' and 'ERROR' messages in the console, but only the log file will hold the 'DEBUG' message.
Alan's answer was great, but it also wrote the messages from the root logger, which was too much for me because I was afraid the size of the log file would get out of control. So I modified it as below to log only what I specified and nothing extra from the imported modules.
import logging
# set up logging to file - see previous section for more details
logging.basicConfig(format='%(asctime)s %(name)-12s %(levelname)-8s %(message)s',
                    datefmt='%m-%d %H:%M',
                    filename='/temp/myapp.log',
                    filemode='w')
logger = logging.getLogger('myapp')
logger.setLevel(logging.DEBUG)
# define a Handler which writes INFO messages or higher to the sys.stderr
console = logging.StreamHandler()
console.setLevel(logging.INFO)
# set a format which is simpler for console use
formatter = logging.Formatter('%(name)-12s: %(levelname)-8s %(message)s')
# tell the handler to use this format
console.setFormatter(formatter)
# add the handler to your logger
logger.addHandler(console)
Then in my app, I logged as below, just using logger (rather than the numbered logger1/logger2).
logger.debug('Quick zephyrs blow, vexing daft Jim.')
logger.info('How quickly daft jumping zebras vex.')
logger.warning('Jail zesty vixen who grabbed pay from quack.')
logger.error('The five boxing wizards jump quickly.')

Python TimedRotatingFileHandler not writing to file

I have a TimedRotatingFileHandler configured and I'm trying to get it to write to a file.
import logging
from logging.handlers import TimedRotatingFileHandler

if __name__ == '__main__':
    logger = logging.getLogger('testlog')
    logHandler = TimedRotatingFileHandler(filename="../logfile.log", when="midnight")
    logHandler.setLevel(logging.INFO)
    logFormatter = logging.Formatter('%(asctime)s %(name)-12s %(levelname)-8s %(message)s')
    logHandler.setFormatter(logFormatter)
    logger.addHandler(logHandler)
    print(logger.handlers)
    logger.info('hello')
The file gets created, but my statements aren't getting written. I've tried running this both in the IDE and from the command line.
I also printed logger.handlers and got
[<TimedRotatingFileHandler [MY_DIRECTORY]/logfile.log (INFO)>]
so I have no idea what's wrong. Any suggestions?
Figured it out. I was setting the level on the log handler:
logHandler.setLevel(logging.INFO)
instead of on the logger:
logger.setLevel(logging.INFO)
Without an explicit level, the logger inherits the root logger's default of WARNING, so the INFO record was discarded before it ever reached the handler.
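For reference, a minimal corrected sketch of the snippet above, with the level moved onto the logger:
import logging
from logging.handlers import TimedRotatingFileHandler

logger = logging.getLogger('testlog')
logger.setLevel(logging.INFO)  # set the level on the logger, not (only) the handler
logHandler = TimedRotatingFileHandler(filename="../logfile.log", when="midnight")
logHandler.setFormatter(logging.Formatter('%(asctime)s %(name)-12s %(levelname)-8s %(message)s'))
logger.addHandler(logHandler)
logger.info('hello')  # now written to logfile.log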

How to direct one logger to one file and the rest to another in Python, with filtering

I have been googling for quite some time trying to separate the Selenium (automation API) debug log from my application log into two different files, but the automation API log still ends up in my application log file.
I tried the following approach (I tried the commented line too):
def get_selenium_logger():
    logger = logging.getLogger('selenium.webdriver.remote.remote_connection')
    fh = logging.FileHandler('results/selenium_log.log', delay=True)
    fh.setLevel(logging.DEBUG)
    logger.addHandler(fh)
    return logger

def get_application_logger():
    logger = logging.getLogger()
    logging.basicConfig(level=logging.DEBUG)
    fh = logging.FileHandler('results/automation_log.log', delay=True)
    fh.setLevel(logging.DEBUG)
    formatter = logging.Formatter('%(asctime)s - %(name)s - %(levelname)s - %(module)s - %(funcName)s - %(lineno)d - %(message)s')
    fh.setFormatter(formatter)
    #logger.removeFilter(logging.Filter('selenium.webdriver.remote.remote_connection'))
    logger.addHandler(fh)
    return logger

def my_automation_code():
    get_selenium_logger()
    app_logger = get_application_logger()
    app_logger.info("Test **************")
The debug log from the automation API (Selenium) is also written to "automation_log.log". How can I filter it out?
Instead of using the root logger for your application, use a logger whose name doesn't start with "selenium." to separate the two logs. For example:
def get_application_logger():
    logger = logging.getLogger('myapp')
    # rest of the stuff as in your snippet
Update: The name of your application is whatever you decide. In a module which is imported, you can use __name__ as the logger name; in a script, it can be whatever you like, e.g. the basename of the script.
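A fuller sketch of that suggestion (details such as the explicit setLevel calls and the propagate line are my additions, not from the original answer): the application handler hangs off the 'myapp' logger rather than the root logger, so Selenium's records, which propagate towards the root, never reach it.
import logging

def get_selenium_logger():
    logger = logging.getLogger('selenium.webdriver.remote.remote_connection')
    logger.setLevel(logging.DEBUG)
    logger.addHandler(logging.FileHandler('results/selenium_log.log', delay=True))
    logger.propagate = False  # optional: keep Selenium records out of any root handlers too
    return logger

def get_application_logger():
    logger = logging.getLogger('myapp')  # named logger instead of the root logger
    logger.setLevel(logging.DEBUG)
    fh = logging.FileHandler('results/automation_log.log', delay=True)
    fh.setFormatter(logging.Formatter('%(asctime)s - %(name)s - %(levelname)s - %(message)s'))
    logger.addHandler(fh)
    return logger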

Get Output From the logging Module in IPython Notebook

When I run the following inside an IPython Notebook I don't see any output:
import logging
logging.basicConfig(level=logging.DEBUG)
logging.debug("test")
Anyone know how to make it so I can see the "test" message inside the notebook?
Try the following:
import logging
logger = logging.getLogger()
logger.setLevel(logging.DEBUG)
logging.debug("test")
According to logging.basicConfig:
Does basic configuration for the logging system by creating a
StreamHandler with a default Formatter and adding it to the root
logger. The functions debug(), info(), warning(), error() and
critical() will call basicConfig() automatically if no handlers are
defined for the root logger.
This function does nothing if the root logger already has handlers
configured for it.
It seems like the IPython notebook calls basicConfig (or sets a handler) somewhere.
If you still want to use basicConfig, reload the logging module like this
from importlib import reload # Not needed in Python 2
import logging
reload(logging)
logging.basicConfig(format='%(asctime)s %(levelname)s:%(message)s', level=logging.DEBUG, datefmt='%I:%M:%S')
My understanding is that the IPython session starts up logging so basicConfig doesn't work. Here is the setup that works for me (I wish this was not so gross looking since I want to use it for almost all my notebooks):
import logging
logger = logging.getLogger()
fhandler = logging.FileHandler(filename='mylog.log', mode='a')
formatter = logging.Formatter('%(asctime)s - %(name)s - %(levelname)s - %(message)s')
fhandler.setFormatter(formatter)
logger.addHandler(fhandler)
logger.setLevel(logging.DEBUG)
Now when I run:
logging.error('hello!')
logging.debug('This is a debug message')
logging.info('this is an info message')
logging.warning('tbllalfhldfhd, warning.')
I get a "mylog.log" file in the same directory as my notebook that contains:
2015-01-28 09:49:25,026 - root - ERROR - hello!
2015-01-28 09:49:25,028 - root - DEBUG - This is a debug message
2015-01-28 09:49:25,029 - root - INFO - this is an info message
2015-01-28 09:49:25,032 - root - WARNING - tbllalfhldfhd, warning.
Note that if you rerun this without restarting the IPython session, it will write duplicate entries to the file, since there would now be two file handlers defined.
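One way to guard against that when re-running the cell (my own addition, not from the original answer) is to drop any handlers already attached to the root logger before adding a new one:
import logging

logger = logging.getLogger()
# remove any handlers already attached to the root logger (e.g. from a previous run of this cell)
for handler in list(logger.handlers):
    logger.removeHandler(handler)

fhandler = logging.FileHandler(filename='mylog.log', mode='a')
fhandler.setFormatter(logging.Formatter('%(asctime)s - %(name)s - %(levelname)s - %(message)s'))
logger.addHandler(fhandler)
logger.setLevel(logging.DEBUG)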
Bear in mind that stderr is the default stream for the logging module, so in IPython and Jupyter notebooks you might not see anything unless you configure the stream to stdout:
import logging
import sys
logging.basicConfig(format='%(asctime)s | %(levelname)s : %(message)s',
                    level=logging.INFO, stream=sys.stdout)
logging.info('Hello world!')
What worked for me (Jupyter notebook server 5.4.1, IPython 7.0.1):
import logging
logging.basicConfig()
logger = logging.getLogger('Something')
logger.setLevel(logging.DEBUG)
Now I can use logger to print info; otherwise I would see only messages at the default level (logging.WARNING) or above.
You can configure logging by running %config Application.log_level="INFO"
For more information, see IPython kernel options
I wanted a simple and straightforward answer to this, with nicely styled output so here's my recommendation
import sys
import logging
logging.basicConfig(
    format='%(asctime)s [%(levelname)s] %(name)s - %(message)s',
    level=logging.INFO,
    datefmt='%Y-%m-%d %H:%M:%S',
    stream=sys.stdout,
)
log = logging.getLogger('notebook')
Then you can use log.info() or any of the other logging levels anywhere in your notebook with output that looks like this
2020-10-28 17:07:08 [INFO] notebook - Hello world
2020-10-28 17:12:22 [INFO] notebook - More info here
2020-10-28 17:12:22 [INFO] notebook - And some more
As of Python 3.8, a force parameter has been added to basicConfig that removes any existing handlers, which allows basicConfig to work. This worked on IPython version 7.29.0 and Jupyter Lab version 3.2.1.
import logging
logging.basicConfig(level=logging.DEBUG,
                    force=True)
logging.debug("test")
I set up a logger for a file and also wanted it to show up in the notebook. It turns out adding a file handler clears out the default stream handler.
import logging

logger = logging.getLogger()
formatter = logging.Formatter('%(asctime)s - %(name)s - %(levelname)s - %(message)s')
# Setup file handler
fhandler = logging.FileHandler('my.log')
fhandler.setLevel(logging.DEBUG)
fhandler.setFormatter(formatter)
# Configure stream handler for the cells
chandler = logging.StreamHandler()
chandler.setLevel(logging.DEBUG)
chandler.setFormatter(formatter)
# Add both handlers
logger.addHandler(fhandler)
logger.addHandler(chandler)
logger.setLevel(logging.DEBUG)
# Show the handlers
logger.handlers
# Log Something
logger.info("Test info")
logger.debug("Test debug")
logger.error("Test error")
setup
import logging
# make a handler
handler = logging.StreamHandler()
formatter = logging.Formatter('%(asctime)s - %(name)s - %(levelname)s - %(message)s')
handler.setFormatter(formatter)
# add it to the root logger
logging.getLogger().addHandler(handler)
log from your own logger
# make a logger for this notebook, set verbosity
logger = logging.getLogger(__name__)
logger.setLevel('DEBUG')
# send messages
logger.debug("debug message")
logger.info("so much info")
logger.warning("you've veen warned!")
logger.error("bad news")
logger.critical("really bad news")
2021-09-02 18:18:27,397 - __main__ - DEBUG - debug message
2021-09-02 18:18:27,397 - __main__ - INFO - so much info
2021-09-02 18:18:27,398 - __main__ - WARNING - you've been warned!
2021-09-02 18:18:27,398 - __main__ - ERROR - bad news
2021-09-02 18:18:27,399 - __main__ - CRITICAL - really bad news
capture logging from other libraries
logging.getLogger('google').setLevel('DEBUG')
from google.cloud import storage
client = storage.Client()
2021-09-02 18:18:27,415 - google.auth._default - DEBUG - Checking None for explicit credentials as part of auth process...
2021-09-02 18:18:27,416 - google.auth._default - DEBUG - Checking Cloud SDK credentials as part of auth process...
2021-09-02 18:18:27,416 - google.auth._default - DEBUG - Cloud SDK credentials not found on disk; not using them
...

Logging messages using the stream handler appear red in the console

I've set up a logger using the following code:
def setup_logging():
    import logging
    import logging.handlers
    import os
    #from time import gmtime, strftime
    #import logging.handlers
    logger = logging.getLogger('apt')
    logger.setLevel(logging.DEBUG)
    # create file handler
    fh = logging.handlers.RotatingFileHandler(os.path.join('..', 'logs', 'apt.log'), maxBytes=1000000, backupCount=5)
    fh.setLevel(logging.DEBUG)
    # create console handler
    ch = logging.StreamHandler()
    ch.setLevel(logging.DEBUG)
    # create formatter and add it to the handlers
    formatter = logging.Formatter('%(asctime)s - %(name)s - %(levelname)s - %(message)s')
    ch.setFormatter(formatter)
    fh.setFormatter(formatter)
    # add the handlers to the logger
    logger.addHandler(ch)
    logger.addHandler(fh)
This prints the log messages to both a file and the console (which is what I was after). The only issue is that the console messages are red. This is distracting since red makes everything look like an error (when it's just info). How can I change it so that the console messages are a different color?
Ideally, black for debug and info, red for warning and above.
I'm using Eclipse and PyDev.
The PyDev console highlights messages sent to stderr in red by default, and Python's logging StreamHandler writes to stderr by default. If you wish to change this behavior, see this post: Logging, StreamHandler and standard streams
To change the colors in PyDev, see here: http://pydev.org/manual_adv_interactive_console.html
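As a quick sketch of the first suggestion (my own illustration, not from the linked post): point the StreamHandler at sys.stdout so PyDev no longer applies the red stderr highlighting.
import logging
import sys

logger = logging.getLogger('apt')
logger.setLevel(logging.DEBUG)
ch = logging.StreamHandler(sys.stdout)  # stdout instead of the default stderr
ch.setFormatter(logging.Formatter('%(asctime)s - %(name)s - %(levelname)s - %(message)s'))
logger.addHandler(ch)
logger.info("shown in the PyDev console without the red stderr highlighting")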
