Is it possible to allow Python to read from stdin from another source such as a file continually? Basically I'm trying to allow my script to use stdin to echo input and I'd like to use a file or external source to interact with it (while remaining open).
An example might be (input.py):
#!/usr/bin/python
import sys
line = sys.stdin.readline()
while line:
    print line,
    line = sys.stdin.readline()
Executing this directly, I can continuously enter text and it echoes back while the script remains alive. If you use an external source though, such as a file or input from bash, the script exits immediately after receiving the input:
$ echo "hello" | python input.py
hello
$
Ultimately what I'd like to do is:
$ tail -f file | python input.py
Then, if the file updates, have input.py echo back anything that is added to the file while remaining open. Maybe I'm approaching this the wrong way, or I'm simply clueless, but is there a way to do it?
Use the -F option to tail to make it reopen the file if it gets renamed or deleted and a new file is created with the original name. Some editors write the file this way, and logfile rotation scripts also usually work this way (they rename the original file to filename.1, and create a new log file).
$ tail -F file | python input.py
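Incidentally, the echo loop can be written more compactly in Python 3; a minimal sketch, where flush=True makes each line appear immediately even when stdout is a pipe rather than a terminal:

import sys

# Iterate over stdin line by line; the loop ends when the pipe closes.
for line in sys.stdin:
    print(line, end="", flush=True)   # end="" keeps the line's own newline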
In a Python project, I need the output of an external (non-Python) command.
Let's call it identify_stuff.*
Command-line scenario
When called from command-line, this command requires a file name as an argument.
If its input is generated dynamically, we cannot pipe it into the command – neither of these works:
cat input/* | ./identify_stuff > output.txt
cat input/* | ./identify_stuff - > output.txt
... it strictly requires a file name it can open, so one would need to create a temporary file on disk for the output of the first command, from which the second command can then read the data.
However, the identify_stuff program iterates over the input lines only once; no seeking or re-reading is involved.
So in Bash we can avoid the temporary file with the <(...) construct.
This works:
./identify_stuff <(cat input/*) > output.txt
This pipes the output of the first command to a device at a path of the form /dev/fdX, which can be opened just like the path of a regular file on disk.
Actual scenario: call from within Python
Now, instead of just cat input/*, the input text is created inside a Python program, which continues to run after the output of identify_stuff has been captured.
The natural choice for calling an external command is the standard-library's subprocess.run().
For performance reasons, I would like to avoid creating a file on-disk.
Is there any way to do this with the subprocess tools?
The stdin and input parameters of subprocess.run won't work, because the external command ignores STDIN and specifically requires a file-name argument.
*Actually, it's this tool: https://github.com/jakelever/Ab3P/blob/master/identify_abbr.C
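One way to reproduce Bash's <(...) from within Python is to create an OS-level pipe and hand the child a /dev/fd path to its read end. A minimal sketch, assuming a Linux-style /dev/fd filesystem; the helper name is made up, and the identify_stuff invocation is only illustrative:

import os
import subprocess

def run_with_fd_path(cmd, input_bytes):
    # Hypothetical helper: feed input_bytes to a command that insists on
    # a file-name argument, without creating a file on disk.
    r, w = os.pipe()
    proc = subprocess.Popen(
        cmd + [f"/dev/fd/{r}"],
        pass_fds=(r,),           # keep the read end open in the child
        stdout=subprocess.PIPE,
    )
    os.close(r)                  # the parent only writes
    with os.fdopen(w, "wb") as writer:
        writer.write(input_bytes)   # closing the pipe signals EOF
    out, _ = proc.communicate()
    return out

# e.g.: output = run_with_fd_path(["./identify_stuff"], generated_text)

Note that for very large inputs and outputs, writing everything before reading can fill the child's stdout pipe and deadlock; in that case the write should happen in a separate thread.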
I'm currently trying to run the shell script by using the os.system method in Python.
Python code:
file = open("History.txt","w")
file.write(history)
os.system('./TFPupload.sh')
Shell script code:
#!/bin/sh
HOST="ftp.finalyearproject95.com"
USER='*****'
PASSWD='*****'
FILE='History.txt'
ftp -n $HOST <<END_SCRIPT
quote USER $USER
quote PASS $PASSWD
put $FILE
quit
END_SCRIPT
echo ">>>uploaded<<<\n"
exit 0
At first, when I tried to run the Python code and the shell script one by one, it worked perfectly. However, when I attempt to use Python to run the shell script, instead of uploading the 'History.txt' file that contains data to the database, the uploaded file is empty. When I check using 'nano History.txt', it does contain data; only the text file passed to the database is empty. Why is that?
Use a with statement to open files whenever possible.
with open("History.txt", "w") as file:
    file.write(history)
os.system('./TFPupload.sh')
The with statement takes care of closing the file descriptor on its own.
See also: What is the python "with" statement designed for?
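If you would rather keep an explicit open, closing (or at least flushing) the file before launching the script achieves the same thing; a minimal sketch:

import os

file = open("History.txt", "w")
file.write(history)
file.close()                 # make sure the data actually reaches the file
os.system('./TFPupload.sh')  # the script now sees the complete file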
I am trying to write a small program in bash, and part of it needs to be able to get some values from a txt file where the entries are on separate lines, and then add each line either to a variable or to one array.
So far I have tried this:
FILE=$"transfer_config.csv"
while read line
do
MYARRAY[$index]="$line"
index=$(($index+1))
done < $FILE
echo ${MYARRAY[0]}
This just produces a blank line, though, and not what was on the first line of the config file.
I am not getting any errors, which is why I am not sure why this is happening.
The bash script is called through a python script using os.system("$HOME/bin/mcserver_config/server_transfer/down/createRemoteFolder"), but if I simply call it after the python program has made the file which the bash script reads, it works.
I am almost 100% sure it is not an issue with the directories, because pwd at the top of the bash script shows it in the correct directory, and the python program is also creating the data file in the correct place.
Any help is much appreciated.
EDIT:
I also tried subprocess.call("path_to_script", shell=True) to see if it would make a difference; I know it is unlikely, but it didn't.
I suspect that when calling the bash script from Python, having just created the file, you are not really finished with that file: you should either explicitly close it or use a with construct.
Otherwise, the written data is still sitting in a buffer (in the file object, in the OS, or wherever). Only closing (or at least flushing) the file makes sure the data is actually in the file.
BTW, instead of os.system, you should use the subprocess module...
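Putting both points together, a minimal sketch (data stands in for whatever your program writes; the script path is the one from the question and is assumed to be executable):

import os
import subprocess

# Close the file before the bash script runs, and prefer subprocess
# over os.system so errors are easier to detect.
with open("transfer_config.csv", "w") as f:
    f.write(data)

# subprocess.call does not go through a shell, so expand ~ ourselves.
script = os.path.expanduser(
    "~/bin/mcserver_config/server_transfer/down/createRemoteFolder")
subprocess.call([script])    # the file is flushed and closed by now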
So I know how to execute a Python script and have it output in the command window using os.system or even subprocess. I also know that capturing output into a text file is done via os.system('my command > me.txt').
My question here is:
Is there a way to do both of them with one command? I.e., execute a Python script, capture the output to a text file, AND have it show in the command window?
If this is not possible here is another question:
The command I want to execute takes up to 8 minutes to finish and writes up to 300 lines to a text file. Can I access the text file while the command is still executing, in order to get some data without waiting for it to finish? Like accessing the text file every minute, for example?
If neither is possible then this would also do the job:
When my command executes successfully, it prints out a Done statement in the cmd window, along with many other lines. Can I check in the cmd whether that "Done" string was printed, or does it need to be captured in a text file for that search to happen?
The easiest way to save the output of a command while echoing it to stdout is to use tee:
$ python script.py | tee logfile.log
If you want to follow the output of a file while it is being written, use tail:
$ tail -f logfile
You might want to unbuffer or flush the output immediately to be able to read it before a full line or a buffer is filled up:
$ unbuffer python script.py > logfile
$ tail -f logfile
or
print("to file..", flush=True)
If you can do this from within your script rather than from the command line, it would be quite easy.
with open("output.txt", "w") as output:
    print("what you want as output", file=output)  # prints to the output file
    print("what you want as output")               # prints to the screen
An easy way I devised is to create a function that prints both to the screen and to a file. The example below works when you pass the output file name as an argument:
OutputFile = args.Output_File
if args.Output_File:
    OF = open(OutputFile, 'w')

def printing(text):
    print(text)
    if args.Output_File:
        OF.write(text + "\n")

# To print a line_of_text both to screen and to file, all you need to do is:
printing(line_of_text)
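For completeness, the question's follow-ups (reading the output while the command runs, and checking for the "Done" line) can also be handled from Python by reading the child's output as it is produced. A rough sketch, with the script name and log file name as placeholders:

import subprocess

done_seen = False
with open("logfile.log", "w") as log:
    proc = subprocess.Popen(
        ["python", "script.py"],
        stdout=subprocess.PIPE,
        stderr=subprocess.STDOUT,   # merge stderr into the same stream
        text=True,
    )
    for line in proc.stdout:        # yields lines as the child prints them
        print(line, end="")         # echo to the command window
        log.write(line)             # and capture to the text file
        if "Done" in line:
            done_seen = True
    proc.wait()

print("command reported Done" if done_seen else "no Done line seen")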
I have a simple python script that just takes in a filename, and spits out a modified version of that file. I would like to redirect stdout (using '>' from the command line) so that I can use my script to overwrite a file with my modifications, e.g. python myScript.py test.txt > test.txt
When I do this, the resulting test.txt does not contain any text from the original test.txt - just the additions made by myScript.py. However, if I don't redirect stdout, then the modifications come out correctly.
To be more specific, here's an example:
myScript.py:
#!/usr/bin/python
import sys
fileName = sys.argv[1]
sys.stderr.write('opening ' + fileName + '\n')
fileHandle = open(fileName)
currFile = fileHandle.read()
fileHandle.close()
sys.stdout.write('MODIFYING\n\n' + currFile + '\n\nMODIFIED!\n')
test.txt
Hello World
Result of python myScript.py test.txt > test.txt:
MODIFYING



MODIFIED!
The reason it works that way is that, before Python even starts, Bash interprets the redirection operator and opens an output stream to write stdout to the file. That operation truncates the file to size 0 - in other words, it clears the contents of the file. So by the time your Python script starts, it sees an empty input file.
The simplest solution is to redirect stdout to a different file, and then afterwards, rename it to the original filename.
python myScript.py test.txt > test.out && mv test.out test.txt
Alternatively, you could alter your Python script to write the modified data back to the file itself, so you wouldn't have to redirect standard output at all.
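That in-place variant might look like this; a minimal sketch based on the script from the question:

import sys

fileName = sys.argv[1]
with open(fileName) as fileHandle:
    currFile = fileHandle.read()

# Reopen the same file for writing; truncation is safe now because
# the original contents are already in memory.
with open(fileName, "w") as fileHandle:
    fileHandle.write('MODIFYING\n\n' + currFile + '\n\nMODIFIED!\n')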
Try redirecting the output to a new file instead; the > redirection operator truncates the target file before your script ever gets to read it.
The utility sponge present in the moreutils package in Debian can handle this gracefully.
python myScript.py test.txt | sponge test.txt
As indicated by its name, sponge will drain its standard input completely before opening test.txt and writing out the full contents of stdin.