Writing a log file from a Python program

I want to output some strings to a log file, and I want the log file to be continuously updated.
I have looked into the logging module of Python and found that it is
mostly about formatting and concurrent access.
Please let me know if I am missing something or if there is any other way of doing it.

Usually I do the following:
import logging

# logging: write detailed output to a file
LOG = "/tmp/ccd.log"
logging.basicConfig(filename=LOG, filemode="w", level=logging.DEBUG)

# console handler: show only errors on the console
console = logging.StreamHandler()
console.setLevel(logging.ERROR)
logging.getLogger("").addHandler(console)
The logging part initialises logging's basic configuration. After that I set up a console handler that prints some logging information separately. Usually my console output is set to show only errors (logging.ERROR), while the detailed output goes to the LOG file.
Your log messages will now be written to the file. For instance, using:
logger = logging.getLogger(__name__)
logger.debug("hiho debug message")
or even
logging.debug("next line")
should work.
Doug Hellmann has a nice guide.

To add my 10 cents with regards to using logging: I only recently discovered the logging module and was put off at first, maybe because it initially looks like a lot of work, but it's really simple and incredibly handy.
This is the setup that I use. It is similar to Mkind's answer, but includes a timestamp.
# Set up logging with a timestamp on each entry
import logging

log = "bot.log"
logging.basicConfig(filename=log, level=logging.DEBUG,
                    format='%(asctime)s %(message)s', datefmt='%d/%m/%Y %H:%M:%S')
logging.info('Log Entry Here.')
Which will produce something like:
22/09/2015 14:39:34 Log Entry Here.

You can log to a file with the Logging API.
Example: http://docs.python.org/2/howto/logging.html#logging-to-a-file
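For completeness, a minimal sketch along the lines of that linked example (the file name example.log is only an assumption for illustration):
import logging

# Write everything at DEBUG level and above to a file.
logging.basicConfig(filename='example.log', level=logging.DEBUG)
logging.debug('This message goes to example.log')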

Related

How to log input and output into a text file in Python?

Is there a way to save a program's input and output to a text file? Say I have a really simple application and I want to log the shell session to a "log.txt" file.
What's your name?
Peter
How old are you?
35
Where are you from?
Germany
Hi Peter, you're 35 years old and you're from Germany!
I believe that the logging module could be the way but I don't know how to use it.
In the official documentation there is a Basic Logging Tutorial.
A very simple example would be:
import logging
logging.basicConfig(filename='example.log', encoding='utf-8', level=logging.DEBUG)
logging.debug('This message should go to the log file')
logging.info('So should this')
logging.warning('And this, too')
logging.error('And non-ASCII stuff, too, like Øresund and Malmö')
Here's the explanation:
First you import logging, and then you configure it with basicConfig, indicating the file you want the log messages saved to.
Then, every time you want to add something to the log, you use one of the logging functions: debug(), info(), warning(), error() and critical(), which are named after the level or severity of the events they are used to track.
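Applied to the dialogue in the question, a minimal sketch (the ask() helper is invented just for this illustration) could log both the prompts and the typed answers to log.txt:
import logging

logging.basicConfig(filename='log.txt', level=logging.INFO, format='%(message)s')

def ask(prompt):
    # Log the prompt, read the answer from the shell, then log the answer too.
    logging.info(prompt)
    answer = input(prompt + '\n')
    logging.info(answer)
    return answer

name = ask("What's your name?")
age = ask("How old are you?")
country = ask("Where are you from?")
summary = "Hi {}, you're {} years old and you're from {}!".format(name, age, country)
print(summary)
logging.info(summary)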

Loguru - how to clear the logs

Is there a way to clear the log file before using it with the Python logging library 'loguru'?
Thanks!
from loguru import logger
# clear the log before logging
logger.add('output.log', format="{time:YYYY-MM-DD HH:mm:ss} {level} {message}")
logger.debug('my message')
The Loguru API does not have the ability to remove or clean up log files. Instead, you could use open('output.log', 'w').close() if you want to erase all contents of the log file, or check whether log rotation in Loguru suits your use case (something like logger.add("file_{time}.log")).
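For instance, a minimal sketch of the truncate-then-add approach with the question's output.log (rotation shown as a commented alternative):
from loguru import logger

# Erase the previous contents before attaching the file sink.
open('output.log', 'w').close()

logger.add('output.log', format="{time:YYYY-MM-DD HH:mm:ss} {level} {message}")
logger.debug('my message')

# Alternative: let Loguru start a fresh file per run instead of clearing one.
# logger.add("file_{time}.log")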

Save each level of logging in a different file

I am working with the Python logging module, but I don't know how to write each logging level to a different file.
For example, I want the debug-level messages (and only debug) saved in the file debug.log.
This is the code I have so far. I have tried using handlers, but it does not work: the debug file receives messages from all the other levels as well. I don't know if what I want is even possible. Here is the code, using handlers and a different file for each logging level:
import logging as _logging
self.logger = _logging.getLogger(__name__)
_handler_debug = _logging.FileHandler(self.debug_log_file)
_handler_debug.setLevel(_logging.DEBUG)
self.logger.addHandler(_handler_debug)
_handler_info = _logging.FileHandler(self.info_log_file)
_handler_info.setLevel(_logging.INFO)
self.logger.addHandler(_handler_info)
_handler_warning = _logging.FileHandler(self.warning_log_file)
_handler_warning.setLevel(_logging.WARNING)
self.logger.addHandler(_handler_warning)
_handler_error = _logging.FileHandler(self.error_log_file)
_handler_error.setLevel(_logging.ERROR)
self.logger.addHandler(_handler_error)
_handler_critical = _logging.FileHandler(self.critical_log_file)
_handler_critical.setLevel(_logging.CRITICAL)
self.logger.addHandler(_handler_critical)
self.logger.debug("debug test")
self.logger.info("info test")
self.logger.warning("warning test")
self.logger.error("error test")
self.logger.critical("critical test")
And the debug.log contains this:
debug test
info test
warning test
error test
critical test
I'm working with classes, so I have adapted the code a bit.
Possible duplicate of: python logging specific level only.
Please see the accepted answer there.
You need to add a filter to each handler that restricts its output to the supplied log level.
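As an illustration, a minimal sketch of such a filter (the class name LevelOnlyFilter is invented for this example); attach one filter per handler, one handler per file:
import logging

class LevelOnlyFilter(logging.Filter):
    # Only let through records whose level matches exactly.
    def __init__(self, level):
        super().__init__()
        self.level = level

    def filter(self, record):
        return record.levelno == self.level

logger = logging.getLogger(__name__)
logger.setLevel(logging.DEBUG)

debug_handler = logging.FileHandler("debug.log")
debug_handler.setLevel(logging.DEBUG)
debug_handler.addFilter(LevelOnlyFilter(logging.DEBUG))
logger.addHandler(debug_handler)

logger.debug("debug test")   # written to debug.log
logger.info("info test")     # kept out of debug.log by the filter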

Python Logging to head of logfile?

I use python logging and would like the latest log entry to be at the head of the log file, rather than the tail.
I cannot find anything in https://docs.python.org/2/howto/logging.html that appears useful for putting a log record at the head of the logfile instead of appending it at the tail.
This is my logger: ("Carl" is my GoPiGo3 robot)
import logging
# create logger
logger = logging.getLogger('lifelog')
logger.setLevel(logging.INFO)
loghandler = logging.FileHandler('/home/pi/Carl/life.log')
logformatter = logging.Formatter('%(asctime)s|%(message)s',"%Y-%m-%d %H:%M")
loghandler.setFormatter(logformatter)
logger.addHandler(loghandler)
#logger.info('-------------')
and I log with:
logger.info('<something to log>')
Is there a python logging module intrinsic solution?
Do I have to write my own handler?
The Answer: I should not want to do this.
Because:
Inserting something at the beginning of a file displaces everything that comes after it, so the bigger the file, the more an insert will cost. That would go counter to the efforts of the logging module to be sparing in its use of resources. – BoarGules Mar 28 at 13:56
Using 'tac' to view the log file is a good solution.
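If you prefer to stay in Python, a small sketch that mimics tac by printing the log newest-first (fine for modest file sizes, since it reads the whole file into memory):
# Print the log file with the most recent entries first.
with open('/home/pi/Carl/life.log') as f:
    for line in reversed(f.readlines()):
        print(line, end='')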

Where can I check tornado's log file?

I think there is a default log file, but I haven't found it yet.
Sometimes the HTTP request handling throws an exception on the screen, but I suspect it also goes somewhere on disk, or I wouldn't know what went wrong during a long-running test.
P.S.: writing an exception handler is another topic; first I'd like an answer to this question.
I found something here:
https://groups.google.com/forum/?fromgroups=#!topic/python-tornado/px4R8Tkfa9c
But it also doesn't mention where I can find those logs.
Tornado uses the standard Python logging module by default.
Here are the logger definitions:
access_log = logging.getLogger("tornado.access")
app_log = logging.getLogger("tornado.application")
gen_log = logging.getLogger("tornado.general")
It doesn't write to files by default. You can run it under supervisord and define in the supervisord config where the log files will be located; supervisord will capture Tornado's output and write it to those files.
You can also do it this way:
import tornado.options

tornado.options.options['log_file_prefix'].set('/opt/logs/my_app.log')
tornado.options.parse_command_line()
But in this case, measure the performance. I don't suggest writing to files directly from the Tornado application if it can be delegated.
FYI: parse_command_line just enables pretty console logging.
With newer versions, you may do
import sys
import tornado.options

args = sys.argv
args.append("--log_file_prefix=/opt/logs/my_app.log")
tornado.options.parse_command_line(args)
or, as @ColeMaclean mentioned, provide
--log_file_prefix=PATH
on the command line.
There's no logfile by default.
You can use the --log_file_prefix=PATH command line option to set one.
Tornado just uses the Python stdlib's logging module, if you're trying to do anything more complicated.
Use RotatingFileHandler:
import logging
from logging.handlers import RotatingFileHandler
log_path = "/path/to/tornado.access.log"
logger_ = logging.getLogger("tornado.access")
logger_.setLevel(logging.INFO)
logger_.propagate = False
handler = RotatingFileHandler(log_path, maxBytes=1024*1024*1024, backupCount=3)
handler.setFormatter(logging.Formatter("[%(name)s][%(asctime)s][%(levelname)s][%(pathname)s:%(lineno)d] > %(message)s"))
logger_.addHandler(handler)
