Passing input to subprocess popen at runtime based on stdout string - python

I am trying to run following code
process = subprocess.Popen(args=cmd, shell=True, stdout=subprocess.PIPE)
while process.poll() is None:
    stdoutput = process.stdout.readline()
    print(stdoutput.decode())
    if '(Y/N)' in stdoutput.decode():
        process.communicate(input=b'Y\n')
This cmd runs for a few minutes, after which it prompts for a confirmation, but process.communicate() is not working, and neither is process.stdin.write().
How do I send the input string 'Y' to this running process when it prompts for confirmation?

Per the doc on Popen.communicate(input=None, timeout=None):
Note that if you want to send data to the process’s stdin, you need to create the Popen object with stdin=PIPE.
Please try that, and if it's not sufficient, do indicate what the symptom is.
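For reference, a minimal sketch of that change, keeping the shape of the code in the question (cmd is assumed to be the same long-running shell command). Note that communicate() also closes stdin and waits for the process to exit, so writing to process.stdin directly and flushing keeps the read loop alive:
import subprocess

# Minimal sketch; `cmd` is assumed to be the same command as in the question.
process = subprocess.Popen(cmd, shell=True,
                           stdin=subprocess.PIPE,
                           stdout=subprocess.PIPE)

while process.poll() is None:
    line = process.stdout.readline().decode()
    print(line, end='')
    if '(Y/N)' in line:
        process.stdin.write(b'Y\n')   # answer the prompt
        process.stdin.flush()         # make sure the child sees it right away

process.wait()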

On top of the answer from #Jerry101, if the subprocess that you are calling is a Python script that uses input(), be aware that, as documented:
If the prompt argument is present, it is written to standard output without a trailing newline.
Thus if you perform readline() as in process.stdout.readline(), it will hang there waiting for the newline \n character, as documented:
f.readline() reads a single line from the file; a newline character (\n) is left at the end of the string
A quick fix is to append the newline \n when requesting the input(), e.g. input("(Y/N)\n") instead of just input("(Y/N)").
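For illustration, a hypothetical child script with that fix applied (the script name and messages are placeholders, not from the original question):
# child.py -- hypothetical confirming subprocess.
# The trailing "\n" in the prompt lets a parent that reads with readline()
# see the prompt immediately instead of blocking on a missing newline.
answer = input("(Y/N)\n")
if answer.strip().upper() == 'Y':
    print("confirmed")
else:
    print("aborted")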
Related question:
Python subprocess stdout doesn't capture input prompt


Catch universal newlines but preserve original

So this is my problem:
I'm trying to write a simple program that runs another process using Python's subprocess module, and I want to catch the real-time output of the process.
I know this can be done as such:
proc = subprocess.Popen(cmd, stdout=subprocess.PIPE)
for line in iter(proc.stdout.readline, ""):
    line = line.rstrip()
    if line != "":
        print(line)
The issue is, the process might generate output with a carriage return \r, and I want to simulate that behavior in my program.
If I use the universal_newlines flag in Popen, then I can catch the output that is generated with a carriage return, but I wouldn't know it was generated that way, and I could only print it "regularly" with a newline. I want to avoid that, as this could be a lot of output.
My question is basically whether I can catch the \r output as if it were a \n, but still differentiate it from actual \n output.
EDIT
Here is some simplified code of what I tried:
File download.py:
import subprocess

try:
    subprocess.check_call(
        [
            "aws",
            "s3",
            "cp",
            "S3_LINK",
            "TARGET",
        ]
    )
except subprocess.CalledProcessError as err:
    print(err)
    raise SystemExit(1)
File process_runner.py:
import os
import sys
import subprocess

proc = subprocess.Popen(cmd, stdout=subprocess.PIPE)
for char in iter(lambda: proc.stdout.read(1), ""):
    sys.stdout.write(char)
The code in download.py uses aws s3 cp, which reports the download progress with carriage returns. I want to reproduce this output behavior in my program process_runner.py, which receives download.py's output.
At first I tried to iterate with readline() instead of read(1). That did not work, because the carriage returns were overlooked.
A possible way is to use the binary interface of Popen, by specifying neither encoding nor errors, and of course not universal_newlines. Then we can wrap the binary stream in a TextIOWrapper with newline='', because the documentation for TextIOWrapper says:
... if newline is None... If it is '', universal newlines mode is enabled, but line endings are returned to the caller untranslated
(which is conformant with PEP 3116)
Your original code could then be changed to:
import io
import subprocess

proc = subprocess.Popen(cmd, stdout=subprocess.PIPE)
out = io.TextIOWrapper(proc.stdout, newline='')
for line in out:
    # line is delimited with the universal newline convention and actually contains
    # the original end of line, be it a raw \r, \n or the pair \r\n
    ...
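A usage sketch under the same assumptions (cmd is the command from the question; completed_lines is just an illustrative place to keep the "real" lines), showing where the two endings can be told apart:
import io
import subprocess
import sys

proc = subprocess.Popen(cmd, stdout=subprocess.PIPE)
out = io.TextIOWrapper(proc.stdout, newline='')
completed_lines = []
for line in out:
    if line.endswith('\r') and not line.endswith('\r\n'):
        sys.stdout.write(line)        # in-place progress update (bare \r)
    else:
        sys.stdout.write(line)        # a finished line (\n or \r\n)
        completed_lines.append(line)  # e.g. keep only finished lines for a log
    sys.stdout.flush()
proc.wait()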

python - sys.stdin.readlines(): stop reading lines in command line

I have code that uses sys.stdin.readlines().
What is the difference between that and sys.stdin.buffer.readlines()?
What exactly do they do?
If they read lines from the command line, how do I stop reading lines at a certain point and let the rest of the program proceed?
1) sys.stdin is a TextIOWrapper; its purpose is to read text from stdin, so the lines you get from it are actual strs. sys.stdin.buffer is a BufferedReader; the lines you get from it are byte strings.
2) They read all the lines from stdin until hitting EOF (or the size hint you give them).
3) If you're trying to read a single line, you can use .readline() (note: no s). Otherwise, when interacting with the program on the command line, you'd have to send it the EOF signal (Ctrl+D on *nix).
Is there a reason you are doing this rather than just calling input() to get one text line at a time from stdin?
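A small sketch of the type difference described above (not from the answer itself): feed it a couple of lines and then EOF.
import sys

# Text stream: a list of str lines, e.g. ['a\n', 'b\n']
lines = sys.stdin.readlines()
print([type(l).__name__ for l in lines])

# The binary variant would instead return byte strings, e.g. [b'a\n', b'b\n']:
#   sys.stdin.buffer.readlines()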
From the docs
sys.stdin
sys.stdout
sys.stderr
File objects corresponding to the interpreter’s standard input, output and error streams. stdin is used for all interpreter input except for scripts but including calls to input(). stdout is used for the output of print() and expression statements and for the prompts of input(). The interpreter’s own prompts and (almost all of) its error messages go to stderr. stdout and stderr needn’t be built-in file objects: any object is acceptable as long as it has a write() method that takes a string argument. (Changing these objects doesn’t affect the standard I/O streams of processes executed by os.popen(), os.system() or the exec*() family of functions in the os module.)
Note: The standard streams are in text mode by default. To write or read binary data to these, use the underlying binary buffer. For example, to write bytes to stdout, use sys.stdout.buffer.write(b'abc').
So, sys.stdin.readlines() reads everything that was passed to stdin and splits the contents into lines (you get a list of strings as a result).
sys.stdin.buffer.readlines() does the same, but on the underlying binary buffer of stdin, so you get byte strings. I'd suggest using the first method, as the binary buffer may be empty while sys.stdin still contains some data.
If you want to stop at some point, use readline() to read only one line at a time.
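As a sketch of that, assuming you want to stop at a blank line (or the literal word "done") rather than waiting for EOF; both the sentinel values and the split() are illustrative:
import sys

collected = []
while True:
    line = sys.stdin.readline()
    if not line or line.strip() in ('', 'done'):
        break                       # EOF, blank line, or the word "done"
    collected.append(line.split())
print(collected)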

Getting output of a process at runtime

I am using a python script to run a process using subprocess.Popen and simultaneously store the output in a text file as well as print it on the console. This is my code:
result = subprocess.Popen(cmd, shell=True, stdout=subprocess.PIPE)
for line in result.stdout.readlines():  # read and store result in log file
    openfile.write("%s\n" % line)
    print("%s" % line)
The above code works fine, but it first waits for the process to complete and collects all the output, and only then does the for loop store and print it.
But I want the output at runtime (my process can take hours to complete, and I don't get any output for all those hours).
So is there any other function that gives me the output dynamically (at runtime), meaning as soon as the process produces its first line, it should get printed?
The problem here is that .readlines() gets the entire output before returning, as it constructs a full list. Just iterate directly:
for line in result.stdout:
    print(line)
.readlines() returns a list of all the lines the process will return while open, i.e., it doesn't return anything until all output from the subprocess is received. To read line by line in "real time":
import sys
from subprocess import Popen, PIPE

proc = Popen(cmd, shell=True, bufsize=1, stdout=PIPE)
for line in proc.stdout:
    openfile.write(line)
    sys.stdout.buffer.write(line)
    sys.stdout.buffer.flush()
proc.stdout.close()
proc.wait()
Note: if the subprocess uses block buffering when run in non-interactive mode, you might need the pexpect or pty modules, or the stdbuf, unbuffer, or script commands.
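For example, on Linux with GNU coreutils, the command can be wrapped with stdbuf to force line-buffered stdout in the child. This is a sketch only: cmd is assumed to be the same command string as above, and stdbuf only helps programs that use C stdio buffering.
from subprocess import Popen, PIPE

# stdbuf -oL asks the child to line-buffer its stdout even when it is a pipe.
proc = Popen('stdbuf -oL ' + cmd, shell=True, stdout=PIPE)
for line in proc.stdout:
    print(line.decode(), end='')
proc.wait()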
Note: on Python 2, you might also need to use iter(), to get "real time" output:
for line in iter(proc.stdout.readline, ""):
    openfile.write(line)
    print line,
You can iterate over the lines one by one by using readline on the pipe:
while True:
    line = result.stdout.readline()
    print line.strip()
    if not line:
        break
The lines contain a trailing \n which I stripped for printing.
When the process terminates, readline returns an empty string, so you know when to stop.

Python subprocess stdout to process stdin

Edit: My main question is, is there some way for a subprocess's stdout to be non-exclusively piped into the process's stdin? Non-exclusively, so that the keyboard still works. Both need to go into a raw_input prompt.
Context: I'm creating a Python program that lets people who've bought tickets with QR codes enter an event. The main part of the program is a raw_input() in a loop that searches through a CSV for a guest's name, email address, or the unique hash that is embedded in the QR codes. I am trying to use zbarcam to scan in the unique hash. This almost works:
from subprocess import Popen, PIPE
p = Popen(["/cygdrive/c/Program Files (x86)/ZBar/bin/zbarcam", "--raw"], stdout=PIPE)
That is, the QR code is scanned and the ticket hash pops up at the prompt, so I can hit enter and have it search. The problem is that zbarcam adds a newline, so when I hit enter, it searches for a newline and returns everything in the CSV. I can't find any way to strip the newline after the zbarcam output is piped to stdout (which goes into raw_input). Come to think of it, I can't even backspace or remove the newline using the keyboard. Do you know how to do that? I've done that sort of thing before, usually by copy-pasting an extra newline.
I added this line after the above ones:
sys.stdin = p.stdout
and the QR code's newline was interpreted as an "enter" and the search process started, but it took away my ability to type in a search term. Is there some way to have stdin routed to both of those?
Is there some other way I can get user input from both the keyboard and zbarcam?
Thanks in advance! I hope I sound coherent; it's been a long day.
Edit: if anyone wants to bash themselves in the head with this profoundly hacked-together code, they can have a look here https://github.com/rtwolf/qr-event-entry/blob/master/pulp_entry.py
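One possible direction (a sketch, not from the original post, written with Python 3 names): rather than reassigning sys.stdin, read the scanner and the keyboard in background threads and merge both sources into a single queue. The zbarcam path is copied from the question; search() is a placeholder for the CSV lookup in the linked script.
import sys
import threading
import queue
from subprocess import Popen, PIPE

events = queue.Queue()

def read_scanner():
    p = Popen(["/cygdrive/c/Program Files (x86)/ZBar/bin/zbarcam", "--raw"],
              stdout=PIPE)
    for raw in p.stdout:
        events.put(raw.decode().rstrip('\n'))   # drop zbarcam's trailing newline

def read_keyboard():
    for line in sys.stdin:
        events.put(line.rstrip('\n'))

threading.Thread(target=read_scanner, daemon=True).start()
threading.Thread(target=read_keyboard, daemon=True).start()

while True:
    term = events.get()     # either a scanned hash or a typed search term
    if term:
        search(term)        # placeholder for the guest lookup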

sys.stdin.readlines() hangs Python script

Every time I execute my Python script, it appears to hang on this line:
lines = sys.stdin.readlines()
What should I do to fix/avoid this?
EDIT
Here's what I'm doing with lines:
lines = sys.stdin.readlines()
updates = [line.split() for line in lines]
EDIT 2
I'm running this script from a git hook, so is there any way around the EOF?
This depends a lot on what you are trying to accomplish. You might be able do:
for line in sys.stdin:
    # do something with line
Of course, with this idiom, as well as with the readlines() method you are using, you need to somehow send the EOF character to your script so that it knows the input is finished. (On Unix, Ctrl-D usually does the trick.)
Unless you are redirecting something to stdin, that would be expected behavior. That line says to read input from stdin (which would be the console you are running the script from), so it is waiting for your input.
See: "How to finish sys.stdin.readlines() input?"
If you're running the program in an interactive session, then this line causes Python to read from standard input (i. e. your keyboard) until you send the EOF character (Ctrl-D (Unix/Mac) or Ctrl-Z (Windows)).
>>> import sys
>>> a = sys.stdin.readlines()
Test
Test2
^Z
>>> a
['Test\n', 'Test2\n']
I know this isn't directly answering your question, as others have already addressed the EOF issue, but typically what I've found works best when reading live output from a long-lived subprocess or stdin is the while/if line approach:
while True:
    line = sys.stdin.readline()
    if not line:
        break
    process(line)
In this case, sys.stdin.readline() will return lines of text until an EOF is reached. Once EOF is given, an empty string is returned, which triggers the break from the loop. A hang can still occur here if an EOF is never provided.
It's worth noting that the ability to process the "live output", while the subprocess/stdin is still running, requires the writing application to flush its output.
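For completeness, a tiny sketch of the writing side, assuming it is itself a Python script: flushing after each line is what lets the reader at the other end of the pipe see the output in real time.
import time

for i in range(5):
    print("line %d" % i, flush=True)   # or sys.stdout.write(...) followed by sys.stdout.flush()
    time.sleep(1)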
