I checked out "Reading stdout from one program in another program" but did not find the answer I was looking for.
I'm new to Linux, and I'm using the argparse module in Python to pass arguments to a program through the terminal on my Mac.
I have program_1.py, which reads a file via sys.stdin and writes data to sys.stdout.
I'm trying to get program_2.py to take the data that program_1.py wrote to sys.stdout and read it as its sys.stdin.
I tried something along the lines of:
Mu$ python program-1.py <sample.txt> program-2.py
For simplicity, let's say that 'sample.txt' just had the string '1.6180339887'
How can program_2.py read the sys.stdout of the previous program as its sys.stdin?
In this simple example, I am just trying to get program_2.py to output '1.6180339887' to sys.stdout so I can see how this works.
Somebody told me to use the | character for pipelines, but I couldn't make it work correctly.
Using a pipe is correct:
python program-1.py sample.txt | python program-2.py
Here's a complete example:
$ cat sample.txt
hello
$ cat program-1.py
import sys
print(open(sys.argv[1]).read())
$ cat program-2.py
import sys
print("program-2.py on stdin got: " + sys.stdin.read())
$ python program-1.py sample.txt
hello
$ python program-1.py sample.txt | python program-2.py
program-2.py on stdin got: hello
(PS: you could have included a complete test case in your question. That way, people could say what you did wrong instead of writing their own)
program-1.py:
import sys
if len(sys.argv) == 2:
    with open(sys.argv[1], 'r') as f:
        sys.stdout.write(f.read())
program-2.py:
import sys
ret = sys.stdin.readline()
print(ret)
sample.txt
1.6180339887
in your shell:
Mu$ python program-1.py sample.txt | python program-2.py
output:
1.6180339887
more info here:
http://en.wikipedia.org/wiki/Pipeline_(Unix) and here http://en.wikibooks.org/wiki/Python_Programming/Input_and_Output
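The same pipeline can also be wired up inside Python itself with subprocess, connecting the stdout of the first process to the stdin of the second. A minimal sketch; the two python -c commands here are stand-ins for program-1.py and program-2.py:

```python
import subprocess
import sys

# First process: prints the value (stand-in for program-1.py reading sample.txt).
p1 = subprocess.Popen(
    [sys.executable, "-c", "print('1.6180339887')"],
    stdout=subprocess.PIPE)

# Second process: reads its stdin and echoes it (stand-in for program-2.py).
p2 = subprocess.Popen(
    [sys.executable, "-c", "import sys; print(sys.stdin.read().strip())"],
    stdin=p1.stdout, stdout=subprocess.PIPE, text=True)

p1.stdout.close()                    # let p2 own the read end of the pipe
result = p2.communicate()[0].strip()
print(result)                        # -> 1.6180339887
```

This is exactly what the shell does for you when you write `cmd1 | cmd2`.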
Related
I would like to save the input code and the output result into a file. For example, take the following Python code, code.py:
print(2+2)
print(3+2)
to create a code-and-output.txt:
>>> print(2+2)
4
>>> print(3+2)
5
But I cannot get it working. Basically, I want code-and-output.txt to capture what would happen if I ran interactive Python and entered the statements there (code + output).
Ways that I have tried so far:
Redirect stdout:
python code.py > code-and-output.txt
It only saves the output.
Redirect stdout and stdin:
python < code.py > code-and-output.txt
It does the same (only output).
nohup
nohup python code.py
The same problem: only output.
Script
script -q code-and-output.txt
python
print(2+2)
print(2+3)
ctrl+d
ctrl+d
It works, but I need to do it manually. Moreover, it saves some garbage that I cannot silence with -q.
Bash Script
# bash-file.sh
python &
print(2+2)
print(2+3)
Does not work: the commands run in the bash console, not in Python. It does not work with & either: the Python REPL never ends.
Using tty
open another terminal, e.g. /dev/pts/2, and send the above bash-file.sh to it
cat bash-file.sh > /dev/pts/2
It just copies but does not run.
I am not interested in solutions like Jupyter and IPython. They have their own problems and do not address my requirement.
Any solution through linux commands (preferably) or python? Thank you.
Save this is as repl_sim.py in the same directory as your code.py:
with open("code.py", 'r') as input_file:
    for line in input_file:
        print(f">>> {line.strip()}")
        eval(line)
Then run in your terminal with the following if you want to redirect the output to a text file named code-and-output.txt:
python repl_sim.py > code-and-output.txt
-OR-
Then run in your terminal with the following if you want to see the output as well as the make the text file matching:
python repl_sim.py | tee code-and-output.txt
It at least works for the example you provided as code.py.
Pure Python version of first option above so that you don't need shell redirect.
Save this code as repl_sim.py:
import contextlib

with open('code-and-output.txt', 'w') as f:
    with contextlib.redirect_stdout(f):
        with open("code.py", 'r') as input_file:
            for line in input_file:
                print(f">>> {line.strip()}")
                eval(line)
Then run in your terminal with:
python repl_sim.py
That will result in code-and-output.txt with your desired content.
Contextlib use based on Raymond Hettinger's August 17th, 2018 tweet and the contextlib.redirect_stdout() documentation.
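One caveat: eval() only handles expressions, so a line like `x = 2 + 2` in code.py would raise a SyntaxError. A variant built on exec() copes with statements as well; a minimal sketch, where the source_lines list stands in for reading code.py:

```python
import contextlib
import io

# Stand-in for the contents of code.py (includes a statement, not just expressions).
source_lines = ["x = 2 + 2", "print(x)"]

namespace = {}    # shared namespace so later lines see earlier assignments
transcript = []
for line in source_lines:
    transcript.append(f">>> {line}")
    buf = io.StringIO()
    with contextlib.redirect_stdout(buf):
        exec(line, namespace)          # exec handles statements and expressions
    if buf.getvalue():
        transcript.append(buf.getvalue().rstrip("\n"))

print("\n".join(transcript))
```

Running it prints the `>>> x = 2 + 2` / `>>> print(x)` / `4` transcript, which can be redirected to code-and-output.txt as before.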
So I'm making a program that prints a random string of digits/letters, and I want it to print to the console as well as into a text file.
password = ''.join(random.choice(f"{letters2}{letters}{digits}") for i in range(option2))
print(password)
# line above prints into console
if question == 1:
    with open('passwords.txt', 'a') as f:
        sys.stdout = f
        print(password)
        sys.stdout = original_stdout
# I'm trying to get what it outputs to the console to print to the file as well
Thanks
On Unix systems, you can achieve this by piping the output of your script to the tee command, which displays it in the terminal and also writes it to whatever file name you provide:
$ python3 -c "print('hello world')" | tee out.txt
hello world
$ cat out.txt
hello world
Note that the first command doesn't need to be a Python script - this is a general solution for simultaneously viewing and saving the output of any program.
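If you would rather stay inside Python, the same effect can be had with a small write-through wrapper that forwards everything printed to several streams at once (an in-process tee). A minimal sketch; the out.txt file name is just an example:

```python
import sys

class Tee:
    """Forward every write to all wrapped streams (terminal and file)."""
    def __init__(self, *streams):
        self.streams = streams
    def write(self, data):
        for s in self.streams:
            s.write(data)
    def flush(self):
        for s in self.streams:
            s.flush()

with open("out.txt", "w") as f:
    sys.stdout = Tee(sys.__stdout__, f)   # print() now hits screen and file
    print("hello world")
    sys.stdout = sys.__stdout__           # restore normal stdout
```

After this runs, "hello world" has appeared on the terminal and is also saved in out.txt.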
I have added a bash command to my Python script - this bash command is used to linearise an input file (i.e. take a file with many lines and merge them into one line). It works when I manually write the name of the file in the command (in this example, the input file is called input.txt):
import subprocess
holder = subprocess.Popen('perl -ne "chomp;print;" input.txt', shell=True, stdout=subprocess.PIPE).stdout.read()
print(holder)
However, I would like this bash command to take the name of the file specified by the user in the command line, and linearise it.
I have already tried to achieve this using %s:
import subprocess
import sys
import os
path = sys.argv[1]
file = open(path, 'r')
inp = file.read()
holder = subprocess.Popen('perl -ne "chomp;print;" %s' % inp, shell=True, stdout=subprocess.PIPE).stdout.read()
print(holder)
However, when I try this, the shell freezes and shows no output; the bash $ prompt does not return, and there is no error message.
I would like to know if there is a valid way to add a user input in this case and would appreciate any help.
EDIT: just to clarify, here is what I type into my shell to run the program with the second example:
$ python3 program.py input.txt
Why not just feed the input path directly to your subprocess, like this? I'm slightly confused as to why you would like to use Perl within Python to do this, however.
import subprocess
import sys
import os
path = sys.argv[1]
holder = subprocess.Popen('perl -ne "chomp;print;" %s' % path, shell=True, stdout=subprocess.PIPE).stdout.read()
print(holder)
with this example input:
test.txt
i
i
i
i
i
the program will output:
python linearize.py test.txt
b'iiiii'
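As a side note, combining shell=True with string formatting is fragile: a file name containing spaces or shell metacharacters will break the command (or worse). Passing the argv as a list avoids the shell entirely. A minimal sketch of the pattern; sys.executable is used here as a stand-in for perl so the example runs anywhere:

```python
import subprocess
import sys

# Pass argv as a list: no shell is involved, so an arbitrary file name
# cannot be misinterpreted as shell syntax. sys.executable stands in
# for any external command such as perl.
result = subprocess.run(
    [sys.executable, "-c", "print('iiiii')"],
    capture_output=True, text=True, check=True,
)
print(result.stdout.strip())   # -> iiiii
```

For the real case, the list would be `["perl", "-ne", "chomp;print;", sys.argv[1]]` with no shell=True.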
import subprocess
import sys
import os
holder = subprocess.Popen('perl -ne "chomp;print;" {}'.format(sys.argv[1]), shell=True, stdout=subprocess.PIPE).stdout.read()
print(holder)
But why? Why call a Perl script from Python? Python has all the tools you need to do the same thing with native, debuggable, cross-platform code in one process. It does not make sense, and it is bad practice.
import sys
with open(sys.argv[1]) as input_file:
    holder = ''.join(input_file.read().splitlines())
print(holder)
I'm learning Python 2.7 on Windows 8.1.
I used echo "XXX" to create a new file, and then used Python to read it, but it output unreadable characters.
But when I used a GUI editor to type something into a new file and then read it in Python, it worked.
Could anyone help me, please?
I wrote this in PowerShell 2.0:
PS C:\Users\Fingal> echo "Stack Overflow is such a good site." > test.txt
And then I wrote this in Notepad++ and saved it as test.py in C:\Users\Fingal:
from sys import argv
script, file_name = argv
txt = open(file_name)
print txt.read()
txt.close()
Then I back to powershell and type:
PS C:\Users\Fingal\> python test.py test.txt
a ataaacaka a0avaearafalaoawa aiasa asauacaha aaa agaoaoada asaiattaea.aa
It output unreadable code.
Fingal
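The garbled output comes from the encoding: PowerShell's > redirection writes UTF-16 text (with a byte-order mark) by default, so a byte-oriented read shows a NUL byte interleaved with each character. Opening the file with a matching encoding decodes it cleanly; a minimal sketch, which first writes a sample UTF-16 file so it is self-contained:

```python
import io

# Simulate what PowerShell's > redirection produces: a UTF-16 file with BOM.
with io.open("test.txt", "w", encoding="utf-16") as f:
    f.write("Stack Overflow is such a good site.")

# Opening with the matching encoding decodes it cleanly instead of
# showing interleaved NUL characters.
with io.open("test.txt", encoding="utf-16") as f:
    text = f.read()
print(text)   # -> Stack Overflow is such a good site.
```

Alternatively, creating the file with `echo "..." | Out-File -Encoding ascii test.txt` avoids the issue on the PowerShell side.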
I am trying to build a Python program that follows a log file and checks for certain patterns (much like grep).
Part of the testing code 'test.py' is to read stdin,
import fileinput
for line in fileinput.input():
    print line
so if I do this in one terminal
tail -f log.txt | python test.py
In another terminal
echo "hello" >> log.txt
you would expect hello to be printed on the first terminal, but it isn't. How should I change the code? I also want to use it like this
cat log.txt | python test.py
with the same test.py.
Echoing sys.stdin directly seems to work on my Mac OS laptop:
import sys
for line in sys.stdin:
    print line.rstrip()
But interestingly, this didn't work very well on my Linux box. It would print the output from tail -f eventually, but the buffering was definitely making it appear as though the program was not working (it would print out fairly large chunks after several seconds of waiting).
Instead I got more responsive behavior by reading from sys.stdin one byte at a time:
import sys
buf = ''
while True:
    buf += sys.stdin.read(1)
    if buf.endswith('\n'):
        print buf[:-1]
        buf = ''
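A middle ground between the two is reading one line at a time with readline(), which returns as soon as a full line arrives and keeps `tail -f log.txt | python test.py` responsive without going byte by byte. A minimal Python 3 sketch, demonstrated on an in-memory stream so it is self-contained; in real use you would pass sys.stdin:

```python
import io

def follow(stream):
    """Yield each complete line as soon as it is available."""
    while True:
        line = stream.readline()   # '' only at EOF; otherwise a full line
        if not line:
            break
        yield line.rstrip("\n")

# Demo with an in-memory stream standing in for sys.stdin.
lines = list(follow(io.StringIO("hello\nworld\n")))
print(lines)   # -> ['hello', 'world']
```

The same function works unchanged for both `tail -f log.txt | python test.py` and `cat log.txt | python test.py`, since readline() simply returns '' when the writer closes the pipe.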