I am trying to build a Python program that follows a log file and checks for certain patterns (much like grep).
Part of the testing code, test.py, reads from stdin:
import fileinput
for line in fileinput.input():
    print line
So if I run this in one terminal:
tail -f log.txt | python test.py
and in another terminal:
echo "hello" >> log.txt
you would expect "hello" to be printed in the first terminal, but it isn't. How should I change the code? I also want to be able to use it like this:
cat log.txt | python test.py
with the same test.py.
Echoing sys.stdin directly seems to work on my Mac OS laptop:
import sys
for line in sys.stdin:
    print line.rstrip()
But interestingly, this didn't work very well on my Linux box. It would print the output from tail -f eventually, but the buffering was definitely making it appear as though the program was not working (it would print out fairly large chunks after several seconds of waiting).
Instead I got more responsive behavior by reading from sys.stdin one byte at a time:
import sys
buf = ''
while True:
    buf += sys.stdin.read(1)
    if buf.endswith('\n'):
        print buf[:-1]
        buf = ''
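If reading one byte at a time feels heavy-handed, a middle ground is to call readline() in a loop: on Python 2, iterating over sys.stdin uses a hidden read-ahead buffer (which is what causes the chunky output under tail -f), while readline() returns as soon as a complete line is available. A minimal sketch, assuming line-oriented input:
import sys

# readline() returns each line as soon as it is complete, and '' only at EOF,
# so this loop stays responsive under `tail -f ... | python test.py`.
for line in iter(sys.stdin.readline, ''):
    sys.stdout.write(line)
    sys.stdout.flush()  # make sure each line shows up immediately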
Related
I would like to save both the input code and its output into a file. For example, given the following Python code, code.py:
print(2+2)
print(3+2)
I want to create code-and-output.txt:
>>> print(2+2)
4
>>> print(3+2)
5
But I cannot get it working. Basically, I want code-and-output.txt to capture what would happen if I ran the statements in the interactive Python interpreter (code + output).
Ways that I have tried so far:
Redirect stdout:
python code.py > code-and-output.txt
It only saves the output.
Redirect stdout and stdin:
python < code.py > code-and-output.txt
It does the same (only output).
nohup
nohup python code.py
The same problem: only output.
Script
script -q code-and-output.txt
python
print(2+2)
print(2+3)
ctrl+d
ctrl+d
It works, but I need to do it manually. Moreover, it saves some garbage that I cannot silence even with -q.
Bash Script
# bash-file.sh
python &
print(2+2)
print(2+3)
Does not work: the commands run in the bash console, not in Python. It does not work with & either: the Python REPL never ends.
Using tty
Open another terminal, say /dev/pts/2, and send the above bash-file.sh to it:
cat bash-file.sh > /dev/pts/2
It just copies the text to that terminal but does not run it.
I am not interested in solutions like Jupyter and IPython; they have their own problems that do not address my requirement.
Any solution through Linux commands (preferably) or Python? Thank you.
Save this as repl_sim.py in the same directory as your code.py:
with open("code.py", 'r') as input_file:
for line in input_file:
print(f">>> {line.strip()}")
eval(line)
Then run in your terminal with the following if you want to redirect the output to a text file named code-and-output.txt:
python repl_sim.py > code-and-output.txt
-OR-
Then run in your terminal with the following if you want to see the output as well as write the matching text file:
python repl_sim.py | tee code-and-output.txt
It at least works for the example you provided as code.py.
Pure Python version of first option above so that you don't need shell redirect.
Save this code as repl_sim.py:
import contextlib

with open('code-and-output.txt', 'w') as f:
    with contextlib.redirect_stdout(f):
        with open("code.py", 'r') as input_file:
            for line in input_file:
                print(f">>> {line.strip()}")
                eval(line)
Then run in your terminal with:
python repl_sim.py
That will result in code-and-output.txt with your desired content.
Contextlib use based on Raymond Hettinger's August 17th 2018 tweet and the contextlib.redirect_stdout() documentation.
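One caveat with both versions: eval() only handles expressions, so a code.py containing statements (assignments, imports, loops) will raise a SyntaxError. A hedged sketch of a variant that falls back to exec() for statements, sharing one namespace across lines and still assuming one complete statement per line in code.py:
# repl_sim_exec.py - a sketch, not the original answer's code: handles
# statements as well as expressions by falling back to exec().
namespace = {}
with open("code.py") as input_file:
    for line in input_file:
        line = line.rstrip("\n")
        print(f">>> {line}")
        try:
            # Try as an expression first, echoing the result like the REPL does.
            result = eval(line, namespace)
            if result is not None:
                print(repr(result))
        except SyntaxError:
            # Assignments, imports, etc. are statements, so use exec() instead.
            exec(line, namespace)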
So I'm making a program that prints a random string of digits/letters, and I want it to print to the console as well as to a text file.
print(''.join(random.choice(f"{letters2}{letters}{digits}") for i in range(0, option2)))
# line above prints into the console

if question == 1:
    with open('passwords.txt', 'a') as f:
        sys.stdout = f
        print( ## )
        sys.stdout = original_stdout
# I'm trying to get what it outputs to the console to print to the file as well
Thanks
On Unix systems, you can achieve this by piping the output of your script to the tee command, which displays it in the terminal and also writes it to whatever file name you provide:
$ python3 -c "print('hello world')" | tee out.txt
hello world
$ cat out.txt
hello world
Note that the first command doesn't need to be a Python script - this is a general solution for simultaneously viewing and saving the output of any program.
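If you would rather keep everything inside the Python script instead of relying on the shell, a minimal sketch is to build the string once, print it, and append the same string to the file, rather than swapping sys.stdout around. The placeholder values below stand in for the question's letters2/letters/digits/option2:
import random
import string

# Placeholder values standing in for the question's variables.
letters = string.ascii_lowercase
letters2 = string.ascii_uppercase
digits = string.digits
option2 = 12

password = ''.join(random.choice(f"{letters2}{letters}{digits}") for _ in range(option2))
print(password)                       # shows up on the console
with open('passwords.txt', 'a') as f:
    f.write(password + '\n')          # the same string is appended to the file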
I have a large Python script that prints a lot of output to my terminal console. The printing does not happen all at once: some print statements print one blob of text, then some other part of the code prints more below it, and this goes on for as long as the main loop runs.
The issue is that I get the output I want, but it is all printed to the console, since that is where the main script is running.
It would be very helpful if, along with the printing to the console, I could also get all of the console output, in the same format, into a text file for retention.
Again, there are print statements scattered across different parts of the script, so I am not sure how to retain the whole console output in the same format in a final text file.
If you want to do the redirection within the Python script, setting sys.stdout to a file object does the trick:
import sys
sys.stdout = open('file', 'w')
print('test')
A far more common method is to use shell redirection when executing (same on Windows and Linux):
$ python foo.py > file
Check this thread Redirect stdout to a file in Python?
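Since the goal here is to keep the output on the console and also capture it in a file, another option is a small tee-style wrapper around sys.stdout. This is only a minimal sketch (the Tee class and the console.log file name are illustrative, not a standard API):
import sys

class Tee:
    """Write everything to all of the wrapped streams."""
    def __init__(self, *streams):
        self.streams = streams

    def write(self, data):
        for stream in self.streams:
            stream.write(data)

    def flush(self):
        for stream in self.streams:
            stream.flush()

log_file = open('console.log', 'w')
sys.stdout = Tee(sys.__stdout__, log_file)

print('this line goes to the console and to console.log')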
A custom print function for both console and file; replace every print call with printing in your code:
outputFile = open('outputfile.log', 'w')

def printing(text):
    print(text)
    if outputFile:
        outputFile.write(str(text) + '\n')  # add the newline that print() appends
You have to pass the file argument to the print() function (note that it expects an open file object, not a file name):
print('whatever', file=file_object)
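For example, a minimal usage sketch (output.txt is just an illustrative name):
with open('output.txt', 'a') as fh:
    print('whatever', file=fh)   # written to the file instead of the console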
I would rather go with bash and use the tee command; it writes the output to a file while still showing it on the console. The -u flag keeps Python's output unbuffered so it appears in the terminal immediately:
python -u my.py | tee my_file.txt
If your Python script is file.py, then use:
python3 file.py > output.txt
or
python file.py > output.txt
depending on your Python version. The > redirection sends everything the program writes to stdout into the file output.txt.
EDIT :
python3 file.py > output.txt;cat output.txt
The above line can be used to print the file output.txt after the program execution.
EDIT2:
Another possible option is to use a custom print function:
f = open('output.txt', 'w')

def custom_print(*args, end='\n'):
    # Print to the console...
    print(*args, end=end)
    # ...and write the same text to the file.
    f.write(' '.join(str(a) for a in args) + end)

# Your code, calling custom_print() instead of print()

f.close()
Is it possible to have Python read continually from stdin when stdin comes from another source such as a file? Basically I'm trying to have my script echo its stdin, and I'd like to use a file or an external source to interact with it while it remains open.
An example might be (input.py):
#!/usr/bin/python
import sys
line = sys.stdin.readline()
while line:
    print line,
    line = sys.stdin.readline()
Executing this directly, I can continuously enter text and it echoes back while the script stays alive. If I use an external source though, such as a file or input from bash, the script exits immediately after receiving the input:
$ echo "hello" | python input.py
hello
$
Ultimately what I'd like to do is:
$ tail -f file | python input.py
Then, when the file is updated, have input.py echo back anything that is added to the file while remaining open. Maybe I'm approaching this the wrong way, or I'm simply clueless, but is there a way to do it?
Use the -F option to tail to make it reopen the file if it gets renamed or deleted and a new file is created with the original name. Some editors write the file this way, and logfile rotation scripts also usually work this way (they rename the original file to filename.1, and create a new log file).
$ tail -F file | python input.py
I checked out Reading stdout from one program in another program but did not find the answer I was looking for.
I'm new to Linux and I'm using the argparse module in Python to pass arguments to a program through the terminal on my Mac.
I have program_1.py, which reads a file via sys.stdin and writes data to sys.stdout.
I'm trying to get program_2.py to take the data that program_1.py wrote to sys.stdout and read it as its sys.stdin.
I tried something along the lines of:
Mu$ python program-1.py <sample.txt> program-2.py
For simplicity, let's say that 'sample.txt' just contains the string '1.6180339887'.
How can program_2.py read the previous program's sys.stdout as its sys.stdin?
In this simple example, I am just trying to get program_2.py to output '1.6180339887' to sys.stdout so I can see how this works.
Somebody told me to use the | character for pipelines but I couldn't make it work correctly.
Using a pipe is correct:
python program-1.py sample.txt | python program-2.py
Here's a complete example:
$ cat sample.txt
hello
$ cat program-1.py
import sys
print open(sys.argv[1]).read()
$ cat program-2.py
import sys
print("program-2.py on stdin got: " + sys.stdin.read())
$ python program-1.py sample.txt
hello
$ python program-1.py sample.txt | python program-2.py
program-2.py on stdin got: hello
(PS: you could have included a complete test case in your question. That way, people could say what you did wrong instead of writing their own)
program-1.py:
import sys
if len(sys.argv) == 2:
    with open(sys.argv[1], 'r') as f:
        sys.stdout.write(f.read())
program-2.py:
import sys
ret = sys.stdin.readline()
print ret
sample.txt
1.6180339887
in your shell:
Mu$ python program-1.py sample.txt | python program-2.py
output:
1.6180339887
more info here:
http://en.wikipedia.org/wiki/Pipeline_(Unix) and here http://en.wikibooks.org/wiki/Python_Programming/Input_and_Output
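As a side note, the same pipeline can also be built from inside Python with the standard-library subprocess module. A rough sketch, equivalent to python program-1.py sample.txt | python program-2.py:
import subprocess

# Start program-1.py with its stdout connected to a pipe.
p1 = subprocess.Popen(['python', 'program-1.py', 'sample.txt'],
                      stdout=subprocess.PIPE)
# Feed that pipe into program-2.py's stdin.
p2 = subprocess.Popen(['python', 'program-2.py'],
                      stdin=p1.stdout, stdout=subprocess.PIPE)
p1.stdout.close()                 # let p1 receive SIGPIPE if p2 exits first
output, _ = p2.communicate()
print(output.decode())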