I have this formatter for Django:
FORMAT = "[%(filename)s:%(lineno)s - %(funcName)20s() ] %(message)s"
The file name I get is
views.py
That is confusing, because it's hard to tell which app that views.py belongs to.
Is there any way to get the app name in the logger formatter?
Use pathname instead of filename in your logging configuration.
FORMAT = "[%(pathname)s:%(lineno)s - %(funcName)20s() ] %(message)s"
There are also other variables you can use — check the logging module documentation for a list.
Note that if you're acquiring a Logger instance using logger = logging.getLogger(__name__) (which is a common way to do it), the logger's name will be the dotted module path (e.g. myapp.views), which you can include in the output with %(name)s.
This is (arguably) better practice, but it will not work if you're doing e.g. logger = logging.getLogger("mylogger") or logger = logging.getLogger().
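A minimal sketch of that approach (the app and module names are made up, and basicConfig stands in for whatever LOGGING config your Django project uses):
import logging

# Same format as before, but with %(name)s instead of %(filename)s
FORMAT = "[%(name)s:%(lineno)s - %(funcName)20s() ] %(message)s"
logging.basicConfig(format=FORMAT)

# In a real project this line lives in myapp/views.py, where __name__ == "myapp.views"
logger = logging.getLogger(__name__)

def my_view(request):  # illustrative view function
    logger.warning("something happened")
    # Example output when called from myapp/views.py (line number and padding will vary):
    # [myapp.views:11 -              my_view() ] something happened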
I have a Python library which I want to use structured logging for. I've been targeting python-json-logger, which automatically converts a dict passed to the extra kwarg of a logger to JSON:
In [1]: import logging
In [2]: from pythonjsonlogger import jsonlogger
In [3]: logger = logging.getLogger()
In [4]: handler = logging.StreamHandler()
In [5]: formatter = jsonlogger.JsonFormatter()
In [6]: handler.setFormatter(formatter)
In [7]: logger.addHandler(handler)
In [8]: logger.warning("test", extra={"username": "jashugan"})
{"message": "test", "username": "jashugan"}
This means that if the application using my library would like JSON-formatted logs, they can configure their logger like above.
However, if the application using my library doesn't do any configuration, then the information I'm storing in extra gets completely lost:
In [1]: import logging
In [2]: logger = logging.getLogger()
In [3]: logger.warning("test", extra={"username": "jashugan"})
test
Is there a way that I could use extra to create structured logs and have the information show up automatically if the application doesn't configure logging to render it specifically?
I want to write log files into a daily folder in Python.
I can build the log path in the handler from the current date, e.g. "../myapp/logs/20150514/xx.log".
But the problem is that the log path doesn't change when the date changes.
I create the logger instance when I start my long-running Python script xx.py, and at that point its log path is "../myapp/logs/20150514/xx.log". The next day, since the instance hasn't changed, the path is still "../myapp/logs/20150514/xx.log" when it should be "../myapp/logs/20150515/xx.log".
How can I make the log output go into a daily folder?
My code for getting a logger instance:
import os
import datetime
import logging
from logging.handlers import RotatingFileHandler

import utils  # my own helper for reading settings from the ini file

logMap = {}  # cache of logger instances, keyed by log file name

def getInstance(file=None):
    global logMap
    if file is None:
        file = 'other/default.log'
    else:
        file = file + '.log'
    if file in logMap:
        return logMap[file]
    else:
        # Build a path like <log_path>/2015-05-14/<file>
        visit_date = datetime.date.today().strftime('%Y-%m-%d')
        date_file = os.path.join(visit_date, file)
        log_path = utils.read_from_ini('log_path').strip()
        log_path = os.path.join(log_path, date_file)
        if not os.path.isdir(os.path.dirname(log_path)):
            os.makedirs(os.path.dirname(log_path))
        logging.basicConfig(datefmt='%Y-%m-%d %H:%M:%S', level=logging.INFO)
        log_format = '[%(asctime)s][%(levelname)s]%(filename)s==> %(message)s'
        formatter = logging.Formatter(log_format)
        log_file = RotatingFileHandler(log_path, maxBytes=10 * 1024 * 1024, backupCount=5)
        log_file.setLevel(logging.INFO)
        log_file.setFormatter(formatter)
        instance = logging.getLogger(file)
        instance.addHandler(log_file)
        logMap[file] = instance
        return instance
Your RotatingFileHandler doesn't rotate on a time basis, but rather a size basis. That's what the maxBytes argument is for. If you want to rotate based on time, use a TimedRotatingFileHandler instead. Note that this works with filenames, but not paths (as far as I know). You can have 20150505.log, 20150506.log, but not 20150505/mylog.log, 20150506/mylog.log.
If you want to rotate folder names, you could probably do it by subclassing TimedRotatingFileHandler and adding your own logic.
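A minimal sketch of the time-based approach (the file name, path, and format are illustrative; as noted above, this gives you dated file names rather than dated folders):
import logging
from logging.handlers import TimedRotatingFileHandler

logger = logging.getLogger('xx')  # hypothetical logger name
logger.setLevel(logging.INFO)

# Roll over at midnight and keep 30 old files; rotated files get a date suffix,
# e.g. xx.log.2015-05-14, xx.log.2015-05-15, ...
handler = TimedRotatingFileHandler('../myapp/logs/xx.log', when='midnight', backupCount=30)
handler.setFormatter(logging.Formatter('[%(asctime)s][%(levelname)s]%(filename)s==> %(message)s'))
logger.addHandler(handler)

logger.info("this goes into the current file; older days are rotated out automatically")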
It is possible to get a logger by module name, like this:
logging.getLogger(module_name)
I would like to add module_name to every log record. Is it possible to set up a Formatter object which adds module_name?
You are looking for the %(name)s parameter; add that to your formatter pattern:
FORMAT = "%(name)s: %(message)s"
logging.basicConfig(format=FORMAT)
or when creating a Formatter():
FORMAT = "%(name)s: %(message)s"
formatter = logging.Formatter(fmt=FORMAT)
See the LogRecord attributes reference:
name (%(name)s): Name of the logger used to log the call.
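For completeness, a small sketch (the handler choice and the logger name are just for illustration) showing that formatter attached to a handler:
import logging

FORMAT = "%(name)s: %(message)s"

handler = logging.StreamHandler()
handler.setFormatter(logging.Formatter(fmt=FORMAT))

# A logger named with the dotted module path, e.g. what logging.getLogger(__name__) would give you
logger = logging.getLogger("mypackage.mymodule")  # hypothetical module name
logger.addHandler(handler)
logger.warning("hello")
# -> mypackage.mymodule: hello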
When you initialize the logger (you only need to do this once for the app), try this config:
logging.basicConfig(
filename='var/bpextract.log',
level=logging.INFO,
format='%(asctime)s %(process)-7s %(module)-20s %(message)s',
datefmt='%m/%d/%Y %H:%M:%S'
)
...later in your code...
log = logging.getLogger("bpextract")
log.info('###### Starting BPExtract App #####')
In logging.basicConfig, you can specify the format:
logging.basicConfig(format='%(name)s\t%(message)s')
Currently I have everything logged to one log file, but I want to separate it out into multiple log files. I looked at the Python logging documentation, but it doesn't discuss this.
log_format = "%(asctime)s - %(name)s - %(levelname)s - %(message)s"
logging.basicConfig(filename=(os.path.join(OUT_DIR, + '-user.log')),
format=log_format, level=logging.INFO, datefmt='%Y-%m-%d %H:%M:%S')
This is how I currently do the logging. What I want is for different types of errors or information to be logged to different log files. At the moment, logging.info('Logging IN') and logging.error('unable to login') go to the same log file. I want to separate them. Do I need to create another logging object to support logging into another file?
What you /could/ do (I haven't dug into the logging module too much so there may be a better way to do this) is maybe use a stream rather than a file object:
In [1]: class LogHandler(object):
   ...:     def write(self, msg):
   ...:         print('a :%s' % msg)
   ...:         print('b :%s' % msg)
   ...:
In [3]: import logging
In [4]: logging.basicConfig(stream=LogHandler())
In [5]: logging.critical('foo')
a :CRITICAL:root:foo
b :CRITICAL:root:foo
In [6]: logging.warn('bar')
a :WARNING:root:bar
b :WARNING:root:bar
Edit with further handling:
You could then do something like this (the files are opened in append mode, so they don't need to exist beforehand):
import logging

class LogHandler(object):
    format = '%(levelname)s %(message)s'
    files = {
        'ERROR': 'error.log',
        'CRITICAL': 'error.log',
        'WARNING': 'warn.log',
    }
    def write(self, msg):
        # The format puts the level name first, so split it off to pick a file.
        type_ = msg[:msg.index(' ')]
        with open(self.files.get(type_, 'log.log'), 'a') as f:
            f.write(msg)

logging.basicConfig(format=LogHandler.format, stream=LogHandler())
logging.critical('foo')
This lets you split your logging into various files based on conditions in your log messages. If the level isn't in the mapping, it simply falls back to log.log.
I created this solution from docs.python.org/2/howto/logging-cookbook.html
Simply create two logging file handlers, assign their logging level and add them to your logger.
import os
import logging
current_path = os.path.dirname(os.path.realpath(__file__))
logger = logging.getLogger('simple_example')
logger.setLevel(logging.DEBUG)
# to log debug messages
debug_log = logging.FileHandler(os.path.join(current_path, 'debug.log'))
debug_log.setLevel(logging.DEBUG)
# to log error messages
error_log = logging.FileHandler(os.path.join(current_path, 'error.log'))
error_log.setLevel(logging.ERROR)
logger.addHandler(debug_log)
logger.addHandler(error_log)
logger.debug('This message should go in the debug log')
logger.info('and so should this message')
logger.warning('and this message')
logger.error('This message should go in both the debug log and the error log')
If I want the access log for CherryPy to be capped at a fixed size, how would I go about using rotating log files?
I've already tried http://www.cherrypy.org/wiki/Logging, which seems out of date, or has information missing.
Look at http://docs.python.org/library/logging.html.
You probably want to configure a RotatingFileHandler
http://docs.python.org/library/logging.html#rotatingfilehandler
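A minimal sketch of what configuring that handler might look like (the file name, size limit, and logger name are illustrative, not CherryPy-specific):
import logging
from logging.handlers import RotatingFileHandler

# Cap the file at ~10 MB and keep 5 rotated backups (access.log.1 ... access.log.5)
handler = RotatingFileHandler('access.log', maxBytes=10 * 1024 * 1024, backupCount=5)
handler.setFormatter(logging.Formatter('%(asctime)s %(message)s'))

logger = logging.getLogger('myapp.access')  # hypothetical logger name
logger.setLevel(logging.INFO)
logger.addHandler(handler)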
I've already tried http://www.cherrypy.org/wiki/Logging, which seems out of date, or has information missing.
Try adding:
import logging
import logging.handlers
import cherrypy # you might have imported this already
and instead of
log = app.log
maybe try
log = cherrypy.log
The CherryPy documentation of the custom log handlers shows this very example.
Here is the slightly modified version that I use in my app:
import logging
from logging import handlers

import cherrypy

def setup_logging():
    log = cherrypy.log
    # Remove the default FileHandlers if present.
    log.error_file = ""
    log.access_file = ""
    maxBytes = getattr(log, "rot_maxBytes", 10000000)
    backupCount = getattr(log, "rot_backupCount", 1000)
    # Make a new RotatingFileHandler for the error log.
    fname = getattr(log, "rot_error_file", "log\\error.log")
    h = handlers.RotatingFileHandler(fname, 'a', maxBytes, backupCount)
    h.setLevel(logging.DEBUG)
    h.setFormatter(cherrypy._cplogging.logfmt)
    log.error_log.addHandler(h)
    # Make a new RotatingFileHandler for the access log.
    fname = getattr(log, "rot_access_file", "log\\access.log")
    h = handlers.RotatingFileHandler(fname, 'a', maxBytes, backupCount)
    h.setLevel(logging.DEBUG)
    h.setFormatter(cherrypy._cplogging.logfmt)
    log.access_log.addHandler(h)

setup_logging()
CherryPy does its logging using the standard Python logging module. You will need to change it to use a RotatingFileHandler. This handler will take care of everything for you, including rotating the log when it reaches a set maximum size.