Output to command line and write to file in a Python script

I have a python script that currently outputs a bunch of text to stdout (the terminal). It looks something like:
print "ABC"
print "CD"
print "QR"
methodCallToOtherCodeThatAlsoPrints()
# A bunch of other print lines...
Right now all of this output just goes to stdout; however, I want the output to be written to a file in addition to the terminal. What is the best way to do this? Ideally I don't want to have to change all of the print statements (to another method call, for instance), and I don't have access to some of the code that is called to do the printing.
I was thinking: is there some way I can redirect print output to both stdout AND a file?
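One standard way to do exactly that is to replace sys.stdout with a small object that writes everything to both the real stdout and a log file. This is only a minimal sketch of the idea, not code from any answer below; the Tee name and the logfile.txt path are illustrative. It also works for Python 2 print statements, since those write to sys.stdout as well:
import sys

class Tee:
    """Duplicate everything written to one stream into a second one."""
    def __init__(self, stream, logfile):
        self.stream = stream
        self.logfile = logfile
    def write(self, data):
        self.stream.write(data)
        self.logfile.write(data)
    def flush(self):
        self.stream.flush()
        self.logfile.flush()

logfile = open('logfile.txt', 'w')   # illustrative path
sys.stdout = Tee(sys.stdout, logfile)
print('ABC')   # now appears on the terminal and in logfile.txt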

Open a file and write your string to it:
with open('my_file', 'w') as f:
    f.write('your_string')
And if you want to redirect your output to a file, use > in the terminal after invoking the .py file:
$ python my_file.py > save_file.txt

Related

How to put the output of ffmpeg into a pipe in Python? [duplicate]

I can successfully redirect my output to a file; however, this appears to overwrite the file's existing data:
import subprocess
outfile = open('test', 'w')  # same with "w" or "a" as the opening mode
outfile.write('Hello')
subprocess.Popen('ls', stdout=outfile)
will remove the 'Hello' line from the file.
I guess a workaround is to store the output elsewhere as a string or something (it won't be too long), and append it manually with outfile.write(thestring), but I was wondering if I am missing something within the module that facilitates this.
You sure can append the output of subprocess.Popen to a file, and I make daily use of it. Here's how I do it:
log = open('some file.txt', 'a') # so that data written to it will be appended
c = subprocess.Popen(['dir', '/p'], stdout=log, stderr=log, shell=True)
(of course, this is a dummy example, I'm not using subprocess to list files...)
By the way, other file-like objects (anything with a write() method, in particular) could replace this log object, so you can buffer the output and do whatever you want with it (write it to a file, display it, etc.) [but this turns out not to be so easy; see my comment below].
Note: what may be misleading is that subprocess, for some reason I don't understand, will write its output before what you wrote yourself. So here's the way to use this:
log = open('some file.txt', 'a')
log.write('some text, as header of the file\n')
log.flush() # <-- here's something not to forget!
c = subprocess.Popen(['dir', '/p'], stdout=log, stderr=log, shell=True)
So the hint is: do not forget to flush the output!
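A complete sketch of the same idea, with ls -l standing in for the Windows dir example and a wait() added so that anything written after the child process also lands in order (the file name is illustrative):
import subprocess

log = open('some_file.txt', 'a')
log.write('some text, as header of the file\n')
log.flush()        # push the header out before the child process writes
c = subprocess.Popen(['ls', '-l'], stdout=log, stderr=log)
c.wait()           # let the child finish before writing anything else
log.write('some text, as footer of the file\n')
log.close()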
Well, the problem is that if you want the header to actually be a header, you need to flush before the rest of the output is written to the file :D
Is the data in the file really overwritten? On my Linux host I see the following behavior:
1) running your code in a separate directory gives:
$ cat test
test
test.py
test.py~
Hello
2) if I add outfile.flush() after outfile.write('Hello'), the result is slightly different:
$ cat test
Hello
test
test.py
test.py~
But the output file contains Hello in both cases. Without an explicit flush() call, the file's buffer is only flushed when the Python process terminates.
Where is the problem?
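In other words, the fix for the snippet in the question is a single flush() (and, to be safe, a wait()). A minimal sketch of the corrected code:
import subprocess

outfile = open('test', 'w')
outfile.write('Hello')
outfile.flush()   # without this, 'Hello' sits in Python's buffer while ls writes first
p = subprocess.Popen('ls', stdout=outfile)
p.wait()
outfile.close()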

How to save Python script console output to a file?

I want to run a Python script and save its console output to a file, while still being able to see the output in the console. For example, a "Hello World" script as simple as print('Hello World') would show Hello World in the console and also save this output to a file.
I was previously using pythonscript.py > console_output.txt for shell redirection, but I can't see anything in the console with this solution (and for some reason it no longer saves anything to the specified file, no idea why). That is a solution using Linux shell redirection, but now I would like to write a separate Python script to do it.
I don't think I need any special error-logging machinery. I'm currently using try/except blocks, printing the exception, and using that to find the error.
You can do something like this:
def custom_print(message_to_print, log_file='output.txt'):
    print(message_to_print)
    with open(log_file, 'a') as of:
        of.write(message_to_print + '\n')
This way you can use custom_print instead of print to be able to both see the result in the console and have the result appended to a log file.
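For example (run.log is just an illustrative file name):
custom_print('Hello World')                         # console + appended to output.txt
custom_print('something else', log_file='run.log')  # or any other log file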
Try the tee command, for example: pythonscript.py | tee console_output.txt

How to store output printed on terminal console to a text file in Python

I have a large Python script. It prints a bunch of output on my terminal console. The problem is that the printing does not happen all at once: some print statements print one blob of output together, then some other part of the code prints some other stuff, and this goes on for as long as the main loop runs.
The issue is that I get the output the way I want it, but it is all printed to the console, since that is where the main Python script runs.
It would be very helpful if, along with the printing to the console, I could also get all of the console output, in the same format, into a text file for retention.
Again, there are a bunch of print statements in different parts of the whole script, so I am not sure how to retain the whole console output in the same format in a final text file.
If you want to do the redirection within the Python script, setting sys.stdout to a file object does the trick:
import sys
sys.stdout = open('file', 'w')
print('test')
A far more common method is to use shell redirection when executing (same on Windows and Linux):
$ python foo.py > file
Check this thread Redirect stdout to a file in Python?
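If the redirection should only be temporary, the standard library's contextlib.redirect_stdout (Python 3.4+) does the same thing and restores sys.stdout automatically when the block exits. A minimal sketch:
import contextlib

with open('file', 'w') as f:
    with contextlib.redirect_stdout(f):
        print('test')            # goes to the file
print('back on the console')     # sys.stdout is restored here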
Custom print function for both console and file; replace all print calls with printing in the code.
outputFile = open('outputfile.log', 'w')
def printing(text):
    print(text)
    if outputFile:
        outputFile.write(str(text) + '\n')  # add the newline that print() adds on the console
You have to pass the file argument to the print() function. Note that it expects an open file object, not a file name:
f = open('file_name', 'w')
print('whatever', file=f)
I would rather go with bash and use the tee command; it redirects the output to a file too:
python -u my.py | tee my_file.txt
If your Python script is file.py, then use:
python3 file.py > output.txt
or
python file.py > output.txt
depending on your Python version. The > operator redirects everything the program prints to stdout into the file output.txt.
EDIT:
python3 file.py > output.txt; cat output.txt
The above line can be used to print the contents of output.txt after the program has run.
EDIT 2:
Another possible option is to use a custom print function:
f = open('output.txt', 'w')  # open for writing, not the default read mode
def custom_print(*s, e='\n'):
    for i in s[:-1]:
        print(i, end=' ')
    print(s[-1], end=e)
    f.write(' '.join(str(i) for i in s) + e)  # mirror the console line in the file
# Your code
# ...
f.close()
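With the corrected signature above, usage would look like:
custom_print('several', 'words')   # prints "several words" and logs the same line
custom_print('no newline', e='')   # e is keyword-only in this fixed version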

How do I get this terminal command to be executed from python?

I have to run an executable whose input parameters are saved in a text file, say input.txt. The output is then redirected to a text file, say output.txt. In a Windows terminal, I use the command:
executable.exe < input.txt > output.txt
How can I do this from within a python program?
I understand that this can be achieved using os.system, but I wanted to do the same using the subprocess module. I was trying something like this:
input_path = '<' + input + '>'
temp = subprocess.call([exe_path, input_path, 'out.out'])
But the Python code executes the exe file without directing the text file to it.
To redirect input/output, use the stdin and stdout parameters of call:
with open(input_path, "r") as input_fd, open("out.out", "w") as output_fd:
    temp = subprocess.call([exe_path], stdin=input_fd, stdout=output_fd)
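Alternatively, if you want to keep the < and > syntax exactly as in the terminal command, you can hand the whole line to the shell. A hedged sketch, reusing exe_path from the question; the usual shell=True caveats about untrusted input apply:
import subprocess

# exe_path, input.txt and out.out as in the question
temp = subprocess.call('{} < input.txt > out.out'.format(exe_path), shell=True)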

Printing an python script output to BOTH a text file and command window

So I know how to execute a Python script and have its output shown in the command window using os.system or even subprocess. I also know that capturing output into a text file can be done via os.system('my command > me.txt').
My question here is:
Is there a way to do both of them with one command? i.e. execute a Python script, capture the output to a text file, AND have it show in the command window?
If this is not possible, here is another question:
The command I want to execute takes up to 8 minutes to finish and writes up to 300 lines to a text file. Can I access the text file while the command is still executing, in order to get some data without waiting for it to finish? Like accessing the text file every minute, for example?
If neither is possible, then this would also do the job:
When my command executes successfully it prints out a Done statement on the cmd, as well as many other lines. Can I check in the cmd whether that "Done" string was printed, or does it need to be captured in a text file for that search to happen?
The easiest way to save the output of a command while echoing it to stdout is to use tee:
$ python script.py | tee logfile.log
If you want to follow the output of a file while it is being written, use tail:
$ tail -f logfile
You might want to unbuffer or flush the output immediately, to be able to read the output before a full line or a buffer fills up:
$ unbuffer python script.py > logfile
$ tail -f logfile
or
print("to file..", flush=True)
If you can do this from within your script rather than from the command line, it is quite easy:
with open("output.txt", "w") as output:
    print >>output, "what you want as output"  # prints to output file
    print "what you want as output"  # prints to screen
The easiest way I devised is to create a function that prints both to the screen and to a file. The example below works when you pass the output file name as an argument:
OutputFile = args.Output_File
if args.Output_File:
    OF = open(OutputFile, 'w')

def printing(text):
    print text
    if args.Output_File:
        OF.write(text + "\n")

# To print a line_of_text both to the screen and the file, all you need to do is:
printing(line_of_text)
