I have to run an executable for which the input parameters are saved in a text file, say input.txt. The output is then redirected to a text file, say output.txt. In the Windows terminal, I use the command:
executable.exe < input.txt > output.txt
How can I do this from within a python program?
I understand that this can be achieved using os.system, but I wanted to do the same using the subprocess module. I was trying something like this:
input_path = '<'+input+'>'
temp = subprocess.call([exe_path, input_path, 'out.out'])
But the Python code just executes the exe file without redirecting the text file to it.
To redirect input/output, use the stdin and stdout parameters of call:
with open(input_path, "r") as input_fd, open("out.out", "w") as output_fd:
    temp = subprocess.call([exe_path], stdin=input_fd, stdout=output_fd)
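On Python 3.5+, the same redirection can also be written with subprocess.run. A minimal sketch, assuming the file names from the question:
import subprocess

exe_path = "executable.exe"   # assumed name from the question
input_path = "input.txt"      # assumed name from the question

with open(input_path, "r") as input_fd, open("out.out", "w") as output_fd:
    result = subprocess.run([exe_path], stdin=input_fd, stdout=output_fd)

print(result.returncode)  # 0 normally indicates the executable exited successfully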
I would like to save the input code and the output result into a file. For example, take the following Python code, code.py:
print(2+2)
print(3+2)
to create a code-and-output.txt:
>>> print(2+2)
4
>>> print(3+2)
5
But I cannot get it working. Basically, I want code-and-output.txt to capture what would happen if I ran these statements in the interactive Python interpreter (code + output).
Ways that I have tried so far:
Redirect stdout:
python code.py > code-and-output.txt
It only saves the output.
Redirect stdout and stdin:
python < code.py > code-and-output.txt
It does the same (only output).
nohup
nohup python code.py
The same problem: only output.
Script
script -q code-and-output.txt
python
print(2+2)
print(2+3)
Ctrl+D
Ctrl+D
It works, but I need to do it manually. Moreover, it saves some garbage that I cannot suppress, even with -q.
Bash Script
# bash-file.sh
python &
print(2+2)
print(2+3)
Does not work: the commands run in the bash console, not in Python. It does not work with & either: the Python REPL never ends.
Using tty
Open another terminal, e.g. /dev/pts/2, and send the above bash-file.sh to it:
cat bash-file.sh > /dev/pts/2
It just copies but does not run.
I am not interested in solutions like Jupyter and IPython. They have their own problems and do not address my requirement.
Any solution through linux commands (preferably) or python? Thank you.
Save this as repl_sim.py in the same directory as your code.py:
with open("code.py", 'r') as input_file:
    for line in input_file:
        print(f">>> {line.strip()}")
        eval(line)
Then run the following in your terminal if you want to redirect the output to a text file named code-and-output.txt:
python repl_sim.py > code-and-output.txt
-OR-
Then run the following in your terminal if you want to see the output as well as produce the matching text file:
python repl_sim.py | tee code-and-output.txt
It at least works for the example you provided as code.py.
A pure Python version of the first option above, so that you don't need the shell redirect.
Save this code as repl_sim.py:
import contextlib
with open('code-and-output.txt', 'w') as f:
    with contextlib.redirect_stdout(f):
        with open("code.py", 'r') as input_file:
            for line in input_file:
                print(f">>> {line.strip()}")
                eval(line)
Then run in your terminal with:
python repl_sim.py
That will result in code-and-output.txt with your desired content.
The contextlib use is based on Raymond Hettinger's August 17th, 2018 tweet and the contextlib.redirect_stdout() documentation.
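Note that eval() only handles expressions; if code.py ever contains a statement such as an assignment, both versions above will raise a SyntaxError. A minimal variant using exec() instead, under the same one-statement-per-line assumption:
with open("code.py", 'r') as input_file:
    for line in input_file:
        print(f">>> {line.rstrip()}")
        exec(line)  # exec() also accepts statements, e.g. assignments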
I have a large Python script that prints a bunch of output to my terminal console. The printing does not happen all at once: some print statements print one blob of output together, then some other part of the code prints more below it, and this goes on for as long as the main loop runs.
The issue is that I get the output I want, but it all goes to the console, since that is where the main Python script is running.
It would be very helpful if, in addition to the printing on the console, I could also get all of the console output, in the same format, into a text file for retention.
Again, there are print statements scattered across different parts of the script, so I am not sure how to capture the whole console output in the same format in a final text file.
If you want to do the redirection within the Python script, setting sys.stdout to a file object does the trick:
import sys
sys.stdout = open('file', 'w')
print('test')
A far more common method is to use shell redirection when executing (same on Windows and Linux):
$ python foo.py > file
Check this thread Redirect stdout to a file in Python?
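If you only need the redirection for part of the script and want sys.stdout restored afterwards, contextlib.redirect_stdout (used earlier in this thread) is an option. A minimal sketch:
import contextlib

with open('file', 'w') as f, contextlib.redirect_stdout(f):
    print('test')  # written to 'file'
print('back on the console')  # sys.stdout is restored here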
A custom print function for both console and file; replace all print calls with printing in the code:
outputFile = open('outputfile.log', 'w')

def printing(text):
    print(text)
    if outputFile:
        outputFile.write(str(text) + '\n')  # add the newline that print() adds on the console
You have to pass an open file object via the file argument of the print() function:
print('whatever', file=file_object)
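For example, a minimal sketch (output.txt is just an illustrative name):
with open('output.txt', 'w') as fh:
    print('whatever', file=fh)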
I would rather go with bash and use the tee command, which writes the output to a file as well as to the console:
python -u my.py | tee my_file.txt
If your Python script is file.py, then use:
python3 file.py > output.txt
Or
python file.py > output.txt
depending on your Python version. The > operator redirects everything the program writes to stdout into the file output.txt.
EDIT:
python3 file.py > output.txt;cat output.txt
The above line can be used to print the file output.txt after the program execution.
EDIT2:
Another possible option is to use a custom print function:
f = open('output.txt', 'w')

def custom_print(*s, end='\n'):
    for i in s[:-1]:
        print(i, end=' ')
    print(s[-1], end=end)
    f.write(' '.join(str(i) for i in s) + end)

# Your code
# ...
f.close()
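Usage would then look like, for example, custom_print(2 + 2) or custom_print('a', 'b', end='!\n'). Note this sketch assumes every call passes at least one positional argument, since it indexes s[-1].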
I have a Python program which is supposed to calculate changes based on a value written to a temporary file (e.g. "12345\n"). The value is always an integer.
I have tried different methods to read the file, but Python was not able to read it. So I had the idea of executing a shell command ("cat") that returns the content. When I execute it in the shell it works fine, but from Python the feedback I get is empty. Then I tried writing a bash script and later a PHP script, each of which reads the file and returns the value. When I call them from Python via the shell, the feedback is empty as well.
I was wondering whether this was a general problem in Python, so I made my scripts return the content of other temporary files, which worked fine.
Inside my scripts I was able to do calculations with the value, and in the shell the output is exactly as expected, but not when they are called via Python. I also noticed that I don't get the value from my extra scripts when they are called by Python (I tried writing it into another file; the file was updated but empty).
The file I am trying to read is in the /tmp directory and is written to several times per second by another script.
I am looking for a solution (open for new ideas) in which I end up having the value of the file in a python variable.
Thanks for the help
Here are my programs:
python:
# python script
import subprocess
stdout = subprocess.Popen(["php /path/to/my/script.php"], shell = True, stdout = subprocess.PIPE).communicate()[0].decode("utf-8")
# other things I tried
#with open("/tmp/value.txt", "r") as file:
# stdout = file.readline() # output = "--"
#stdout = os.popen("cat /tmp/value.txt").read() # output = "--"
#stdout = subprocess.check_output(["php /path/to/my/script.php"], shell = True, stdout = subprocess.PIPE).decode("utf-8") # output = "--"
print(str("-" + stdout + "-")) # output = "--"
php:
# php script
$valueFile = fopen("/tmp/value.txt", "r");
$value = trim(fgets($valueFile), "\n");
fclose($valueFile);
echo $value; # output in the shell is the value of $value
Edit: Context: my Python script is started by another Python script, which listens for commands from an Apache server on the Pi. The value I want to read comes from a "1wire" device that listens for S0 signals.
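For reference, a direct read along these lines would normally be enough; a minimal sketch, assuming /tmp/value.txt contains a single integer line and may briefly be empty while the writer updates it:
def read_value(path="/tmp/value.txt"):
    with open(path) as fh:
        text = fh.read().strip()
    return int(text) if text else None  # None if the file was caught mid-write

value = read_value()
print(value)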
Hi, I'm just starting to use Python scripts to run executable files. What I basically want to do is use Python to edit an input.dat file, run an executable myfile that takes input.dat as its input parameters, save the results from the output file result.dat somewhere, and run this entire process in a loop for varying input.dat.
I've figured out the editing part, but the running and taking input part is what I can't seem to figure out.
On the terminal it would look like
sudo ./myfile < input.dat
You could use subprocess.run() to execute the command with input from a file and redirect its output to another file:
import subprocess
for filename in 'input.dat', 'otherinput.dat', 'moreinput.dat':
    with open(filename) as infile, open('result_{}'.format(filename), 'w') as outfile:
        result = subprocess.run(['sudo', './myfile'], stdin=infile, stdout=outfile)
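If a non-zero exit status from myfile should raise an exception instead of being ignored, subprocess.run() also accepts check=True; the call inside the loop would then become:
        result = subprocess.run(['sudo', './myfile'], stdin=infile, stdout=outfile, check=True)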
I have a python script that currently outputs a bunch of text to stdout (the terminal). It looks something like:
print "ABC"
print "CD"
print "QR"
methodCallToOtherCodeThatAlsoPrints()
# A bunch of other print lines...
Right now all of this output just goes to stdout; however, I want the output to be written to a file in addition to being written to the terminal. What is the best way to do this? Ideally I don't want to have to change all of the print statements (to another method call, for instance), and I don't have access to some of the code that is called to do the printing.
I was thinking: is there some way I can redirect print output to both stdout AND a file?
Open a file and write your string to it:
with open('my_file', 'w') as f:
    f.write('your_string')
And if you want to redirect your output to a file, use > in the terminal after invoking the .py file:
$~ python my_file.py > save_file.txt
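If the goal from the question is to keep printing to the terminal while also writing to a file, one option is to assign a small tee-like object to sys.stdout; a minimal sketch (the class name Tee is just illustrative):
import sys

class Tee:
    # Duplicate writes to the real stdout and to a log file.
    def __init__(self, path):
        self.file = open(path, 'w')
        self.stdout = sys.stdout
    def write(self, text):
        self.stdout.write(text)
        self.file.write(text)
    def flush(self):
        self.stdout.flush()
        self.file.flush()

sys.stdout = Tee('save_file.txt')
print("ABC")  # appears on the terminal and in save_file.txt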