I have a Python script with a logging configuration that is called with subprocess.call from a main script. I would like the subprocess to create the log file and write to it, as configured in the subprocess's code. In addition, I would like the logs to also be displayed in the main process's console output.
The following works on macOS, but doesn't work on a Linux box with Python 3.10.4.
Main Code
import subprocess
import logging
log_format = '%(asctime)s :: %(name)s :: %(levelname)-8s :: %(message)s'
log_streamhandler = logging.StreamHandler()
logging.basicConfig(format=log_format, level=logging.INFO,
                    handlers=[log_streamhandler])
logger = logging.getLogger(__name__)
if __name__ == '__main__':
    code = subprocess.call(["python3", "my_subprocess.py"])
    logger.info(f'Subprocesses completed with code {code}')
my_subprocess.py
import sys
import logging
if __name__ == '__main__':
    log_format = '%(asctime)s :: %(name)s :: %(levelname)-8s :: %(message)s'
    log_filehandler = logging.FileHandler('my_subprocess.log')
    log_streamhandler = logging.StreamHandler(stream=sys.stdout)
    logging.basicConfig(format=log_format, level=logging.INFO,
                        handlers=[log_filehandler, log_streamhandler])
    logger = logging.getLogger(__name__)
    logger.info('Starting subprocess.')
    logger.info('Completed subprocess.')
The reason I create the log file from the subprocess is that it also needs to be able to run on its own and create the log file. Several such subprocesses are called by the main process. How can I restructure this to keep these features and run on Linux as expected?
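One hedged guess about the macOS/Linux difference: "python3" on the Linux box may resolve to a different interpreter than the one running the main script, and pipe buffering can hold back the child's stream output. A minimal, self-contained sketch (it writes a tiny stand-in child script, so the name demo_child.py is illustrative) that launches the child with the parent's own interpreter (sys.executable) and unbuffered output (-u):

```python
import pathlib
import subprocess
import sys

# Stand-in for my_subprocess.py: it configures its own logging, as in the
# question, so the file-creation responsibility stays with the child.
pathlib.Path("demo_child.py").write_text(
    "import logging, sys\n"
    "logging.basicConfig(stream=sys.stdout, level=logging.INFO,\n"
    "                    format='%(asctime)s :: %(name)s :: %(message)s')\n"
    "logging.getLogger('child').info('Starting subprocess.')\n"
)

# sys.executable guarantees the same interpreter as the parent; -u disables
# stdout buffering so the child's log lines reach the parent's console
# immediately rather than sitting in a pipe buffer.
code = subprocess.call([sys.executable, "-u", "demo_child.py"])
```

Since the child inherits the parent's stdout/stderr by default, its StreamHandler output shows up in the main console while its FileHandler still creates the file.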
Related
I've set my Python program to log output, but although it logs correctly to the console, it does not log the time, log level, etc. to the file.
Program:
import time
import logging
from logging.handlers import RotatingFileHandler
logFileName = 'logs.log'
logging.basicConfig(level=logging.INFO, format='%(levelname)s %(asctime)s - %(message)s', datefmt='%d-%b-%y %H:%M:%S')
log = logging.getLogger(__name__)
handler = RotatingFileHandler(logFileName, maxBytes=2000, backupCount=5)
log.addHandler(handler)
log.setLevel(logging.INFO)
if __name__ == '__main__':
    while True:
        log.info("program running")
        time.sleep(1)
Output to console:
INFO 05-May-22 23:20:54 - program running
INFO 05-May-22 23:20:55 - program running
INFO 05-May-22 23:20:56 - program running
INFO 05-May-22 23:20:57 - program running
INFO 05-May-22 23:20:58 - program running
INFO 05-May-22 23:20:59 - program running
INFO 05-May-22 23:21:00 - program running
Simultaneous output to file logs.log:
program running
program running
program running
program running
program running
program running
program running
How to make the full output go to the log file?
You can separately set the formatter for the RotatingFileHandler:
handler.setFormatter(logging.Formatter(fmt='%(levelname)s %(asctime)s - %(message)s', datefmt='%d-%b-%y %H:%M:%S'))
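The point is that basicConfig's format only applies to the handler basicConfig itself creates, so a handler you add yourself needs its own formatter. A complete sketch of the fix (file and logger names are illustrative):

```python
import logging
from logging.handlers import RotatingFileHandler

log = logging.getLogger("rotating_demo")  # illustrative name
log.setLevel(logging.INFO)

# Attach the same format string to the file handler explicitly; without it,
# the handler falls back to the bare "%(message)s" default seen in the file.
handler = RotatingFileHandler("demo.log", maxBytes=2000, backupCount=5)
handler.setFormatter(logging.Formatter(
    fmt="%(levelname)s %(asctime)s - %(message)s",
    datefmt="%d-%b-%y %H:%M:%S"))
log.addHandler(handler)

log.info("program running")
handler.close()

with open("demo.log") as f:
    line = f.read()
```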
Configuring the logger to log to a file and print to stdout did not work for me, so:
I want to print logging details both in prog_log.txt AND in the console. I have:
# Debug Settings
import logging
#logging.disable()  # when program ready, uncomment this line to remove all logging messages
logger = logging.getLogger('')
logging.basicConfig(level=logging.DEBUG,
                    filename='prog_log.txt',
                    format='%(asctime)s - %(levelname)s - %(message)s',
                    filemode='w')
logger.addHandler(logging.StreamHandler())
logging.debug('Start of Program')
The above prints the logging details into the prog_log.txt file properly formatted, but it prints unformatted logging messages multiple times in the console as I re-run the program, like below:
In [3]: runfile('/Volumes/GoogleDrive/Mon Drive/MAC_test.py', wdir='/Volumes/GoogleDrive/Mon Drive/')
Start of Program
Start of Program
Start of Program #after third running
Any help welcome ;)
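The repeated unformatted lines are consistent with a new StreamHandler being added to the same logger on every re-run: runfile reuses the same interpreter session, and loggers are process-global, so each run stacks another handler. A hedged sketch of one common guard (logger name is illustrative):

```python
import logging

logger = logging.getLogger("rerun_demo")  # illustrative name

# Only attach a console handler if this logger does not have one yet, so
# re-running this setup block in the same session cannot stack duplicates.
if not logger.handlers:
    console = logging.StreamHandler()
    console.setFormatter(
        logging.Formatter("%(asctime)s - %(levelname)s - %(message)s"))
    logger.addHandler(console)

# Simulate a second run of the same setup block: the guard now skips it.
if not logger.handlers:
    logger.addHandler(logging.StreamHandler())

print(len(logger.handlers))  # stays at 1
```

Giving the console handler its own Formatter also fixes the unformatted output, since handlers added manually do not pick up basicConfig's format string.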
Hi, I am trying a sample program using the logging module in Python:
import logging
import time,sys
import os
logger = logging.getLogger('myapp')
hdlr = logging.FileHandler('myapp1234.log')
formatter = logging.Formatter('%(asctime)s %(levelname)s %(message)s')
hdlr.setFormatter(formatter)
logger.addHandler(hdlr)
logging.getLogger().setLevel(logging.DEBUG)
logger.error('We have a problem')
logger.info('While this is just chatty')
logger.debug("Sample")
hdlr.flush()
time.sleep(10)
logger.error('We have a problem')
logger.info('While this is just chatty')
logger.debug("Sample")
hdlr.close()
This code is not writing to the file immediately. I even tried handler.flush(), sys.exit(0), and sys.stdout.
When I try to open the file, even after killing the process, I get the following error. The log only appears after 120-200 seconds (sometimes it takes even longer).
How can I make it print immediately (at least by the end of the program)?
Did I miss closing any handler?
Try removing the following statement.
time.sleep(10)
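If records still seem delayed after that, one hedged possibility is that handlers are not flushed and closed at program exit. The standard logging.shutdown() call flushes and closes every registered handler, so everything is on disk before the interpreter exits. A minimal sketch (file and logger names are illustrative):

```python
import logging

logger = logging.getLogger("flush_demo")  # illustrative name
handler = logging.FileHandler("flush_demo.log", "w")
handler.setFormatter(logging.Formatter("%(asctime)s %(levelname)s %(message)s"))
logger.addHandler(handler)

logger.error("We have a problem")

# Flush and close all registered handlers; normally registered to run
# automatically at exit, but calling it explicitly makes the timing certain.
logging.shutdown()

with open("flush_demo.log") as f:
    content = f.read()
```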
If I run my script manually, I see runlog.log populated with messages. When I run the script via cron, there's no output in my log file, BUT I do see it in /var/spool/mail/root, so I know the script works.
file_handler = logging.FileHandler("runlog.log", "a")
file_handler = logging.StreamHandler()
file_handler.setFormatter(default_formatter)
logger = logging.getLogger()
logger.setLevel(logging.DEBUG)
logger.addHandler(file_handler)
...
logger.info("No new files")
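Two things in the snippet above are worth checking. First, file_handler is assigned a FileHandler and then immediately reassigned to a StreamHandler, so only the stream handler is ever attached; cron mails a job's stdout/stderr, which would explain the output in /var/spool/mail/root. Second, cron runs jobs from a different working directory, so a relative "runlog.log" may land somewhere unexpected. A sketch addressing both (logger name is illustrative; assumes this runs as a script so __file__ is defined):

```python
import logging
import os

# Anchor the log file to the script's own directory, not whatever working
# directory cron happens to use.
log_path = os.path.join(os.path.dirname(os.path.abspath(__file__)), "runlog.log")

default_formatter = logging.Formatter("%(asctime)s %(levelname)s %(message)s")

# Keep the two handlers in separate variables so the file handler is not
# overwritten before it is attached.
file_handler = logging.FileHandler(log_path, "a")
file_handler.setFormatter(default_formatter)
stream_handler = logging.StreamHandler()
stream_handler.setFormatter(default_formatter)

logger = logging.getLogger("cron_demo")  # illustrative name
logger.setLevel(logging.DEBUG)
logger.addHandler(file_handler)
logger.addHandler(stream_handler)

logger.info("No new files")
file_handler.close()

with open(log_path) as f:
    content = f.read()
```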
How can I get my two processes to log to a single file?
With my code, only proc1 is logging to my log file...
module.py:
import multiprocessing
import logging
log = multiprocessing.log_to_stderr()
log.setLevel(logging.DEBUG)
handler = logging.FileHandler('/var/log/my.log')
handler.setLevel(logging.DEBUG)
formatter = logging.Formatter('%(asctime)s - %(name)s - %(levelname)s - %(message)s')
handler.setFormatter(formatter)
log.addHandler(handler)
def proc1():
    log.info('Hi from proc1')
    while True:
        if something:
            log.info('something')

def proc2():
    log.info('Hi from proc2')
    while True:
        if something_more:
            log.info('something more')

if __name__ == '__main__':
    p1 = multiprocessing.Process(target=proc1)
    p2 = multiprocessing.Process(target=proc2)
    p1.start()
    p2.start()
As said at https://docs.python.org/2/howto/logging-cookbook.html#logging-to-a-single-file-from-multiple-processes
"Although logging is thread-safe, and logging to a single file from
multiple threads in a single process is supported, logging to a single
file from multiple processes is not supported"
So you should take another approach, e.g. implementing a logging server:
https://docs.python.org/2/howto/logging-cookbook.html#sending-and-receiving-logging-events-across-a-network