I want something like this:
Run the 'ls' command, print its output on STDOUT, and also store the same output in a variable.
For a long-running process I need to see the output on screen as it executes, and also capture it in a variable at the end.
proc = subprocess.Popen(["ls"], stdout=subprocess.PIPE, shell=False)
(out, err) = proc.communicate()
print "program output:-", out
Here the output only appears after the command has finished executing.
To print the output line by line as soon as the child process flushes its stdout, and to store it in a variable:
from subprocess import Popen, PIPE
buf = []
proc = Popen([cmd], stdout=PIPE, bufsize=1)
for line in iter(proc.stdout.readline, b''):
    buf.append(line)
    print line,
proc.communicate() # close `proc.stdout`; wait for the child process to exit
output = b"".join(buf)
There could be a buffering issue (the output appears with a delay); to fix it, you could use the pexpect or pty modules, or the stdbuf, unbuffer, or script commands.
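For example, here is a sketch using stdbuf (assuming GNU coreutils on Linux and the same cmd as above):

from subprocess import Popen, PIPE

# stdbuf -oL asks the child to line-buffer its stdout even when it is
# connected to a pipe, so lines arrive without the block-buffering delay
proc = Popen(['stdbuf', '-oL', cmd], stdout=PIPE, bufsize=1)
for line in iter(proc.stdout.readline, b''):
    print line,
proc.communicate()

Note that stdbuf only affects programs that use C stdio buffering and do not adjust it themselves.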
I have a Python program which executes subprocess.Popen, like this:
process = subprocess.Popen(stand_alone_command, shell=True, stdout=subprocess.PIPE, stderr=subprocess.PIPE)
out, err = process.communicate()
print "out: ", out
print "err: ", err
If my stand_alone_command runs forever, how do I get whatever stand_alone_command is writing to STDOUT and STDERR so that I can log it?
Try reading from stdout instead of calling communicate(), such as:
import subprocess
sac = ['tail', '-f', '/var/log/syslog']
process = subprocess.Popen(sac, shell=False, stdout=subprocess.PIPE, stderr=subprocess.PIPE)
while 1:
    line = process.stdout.readline()
    if line:
        print(line)
I think you'll need to set shell=False, but I'm on Linux and Windows is a bit different.
I am using the subprocess module to interact with the output of Linux commands. Below is my code:
import subprocess
import sys
file_name = 'myfile.txt'
p = subprocess.Popen("grep \"SYSTEM CONTROLLER\" "+ file_name, stdout=subprocess.PIPE, shell=True)
(output, err) = p.communicate()
print output.strip()
p = subprocess.Popen("grep \"controller\|worker\" "+ file_name, stdout=subprocess.PIPE, shell=True)
(output, err) = p.communicate()
lines = output.rstrip().split("\n")
print lines
My program hangs while executing the second subprocess, i.e.:
p = subprocess.Popen("grep \"controller\|worker\""+ file_name,stdout=subprocess.PIPE, shell=True)
I got to know that the reason for the hang is that the buffer redirected to subprocess.PIPE is getting filled, which blocks the process from writing further.
I want to know if there is any way to avoid the buffer-full situation, so that my program keeps executing without hanging.
The actual issue is that there is a whitespace missing between the pattern and the filename, and therefore grep waits for input on the standard input (stdin).
The "buffer full" explanation is a red herring here: .communicate() is not susceptible to it, and switching to p.stdout.read() fixes nothing (it loads the whole output into memory and, unlike .communicate(), it can deadlock if more than one pipe is used).
Drop shell=True and use a list argument for the command:
#!/usr/bin/env python
from subprocess import Popen, PIPE
p = Popen(["grep", r"controller\|worker", file_name], stdout=PIPE)
output = p.communicate()[0]
if p.returncode == 0:
    print('found')
elif p.returncode == 1:
    print('not found')
else:
    print('error')
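If you do keep shell=True, the minimal fix for the hang is simply restoring the missing space, as in the asker's first, working call:

p = subprocess.Popen("grep \"controller\|worker\" " + file_name,
                     stdout=subprocess.PIPE, shell=True)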
As it says at https://docs.python.org/3/library/subprocess.html#subprocess.Popen.communicate:
Note: The data read is buffered in memory, so do not use this method
if the data size is large or unlimited.
Instead, use the file objects to read the text as it is produced:
output = p.stdout.read()
As long as no other pipes (e.g. stderr) fill up while you are reading, the process shouldn't be blocked.
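For example, a sketch that consumes stdout incrementally in fixed-size chunks, so the whole output is never held in a single read() result (handle_chunk is a hypothetical placeholder):

from subprocess import Popen, PIPE

p = Popen(["grep", r"controller\|worker", file_name], stdout=PIPE)
# read 8 KiB at a time until EOF instead of buffering everything at once
for chunk in iter(lambda: p.stdout.read(8192), b''):
    handle_chunk(chunk)  # hypothetical: process each chunk as it arrives
p.wait()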
I'm trying to process both stdout and stderr from a subprocess.Popen call that captures both via subprocess.PIPE, but I would like to handle the output (for example, printing it on the terminal) as it comes.
All the current solutions that I've seen wait for the completion of the Popen call to ensure that all of the stdout and stderr has been captured, and only then process it.
This is an example Python script with mixed output whose order I can't seem to reproduce when processing it in real time (or as close to real time as I can get):
$ cat mix_out.py
import sys
sys.stdout.write('this is an stdout line\n')
sys.stdout.write('this is an stdout line\n')
sys.stderr.write('this is an stderr line\n')
sys.stderr.write('this is an stderr line\n')
sys.stderr.write('this is an stderr line\n')
sys.stdout.write('this is an stdout line\n')
sys.stderr.write('this is an stderr line\n')
sys.stdout.write('this is an stdout line\n')
The one approach that seems like it might work is using threads, because then the reading would be asynchronous and the output could be processed as the subprocess yields it.
The current implementation just processes stdout first and stderr last, which can be deceiving if the output originally alternated between both:
cmd = ['python', 'mix_out.py']
process = subprocess.Popen(
    cmd,
    stdout=subprocess.PIPE,
    stderr=subprocess.PIPE,
    close_fds=True,
    **kw
)

if process.stdout:
    while True:
        out = process.stdout.readline()
        if out == '' and process.poll() is not None:
            break
        if out != '':
            print 'stdout: %s' % out
            sys.stdout.flush()

if process.stderr:
    while True:
        err = process.stderr.readline()
        if err == '' and process.poll() is not None:
            break
        if err != '':
            print 'stderr: %s' % err
            sys.stderr.flush()
If I run the above (saved as out.py) to handle the mix_out.py example script from above, the streams are (as expected) handled in order:
$ python out.py
stdout: this is an stdout line
stdout: this is an stdout line
stdout: this is an stdout line
stdout: this is an stdout line
stderr: this is an stderr line
stderr: this is an stderr line
stderr: this is an stderr line
stderr: this is an stderr line
I understand that some system calls might buffer, and I am OK with that; the one thing I am looking to solve is respecting the order of the streams as they happened.
Is there a way to process both stdout and stderr as they come from subprocess without having to use threads? (The code gets executed on restricted remote systems where threading is not possible.)
The need to differentiate stdout from stderr is a must (as shown in the example output).
Ideally, no extra libraries would be best (e.g. I know pexpect solves this).
A lot of examples out there mention the use of select, but I have failed to come up with something that would preserve the order of the output with it.
If you are looking for a way of having subprocess.Popen output to stdout/stderr in real time, you should be able to achieve that with:
import sys, subprocess
p = subprocess.Popen(cmdline,
                     stdout=sys.stdout,
                     stderr=sys.stderr)
Using stderr=subprocess.STDOUT may simplify your filtering, IMO.
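A minimal sketch of that merged-stream variant, assuming the mix_out.py script from the question (note that the stdout/stderr distinction is lost):

import subprocess

# merge stderr into stdout so a single pipe carries both streams in the
# order the child wrote them (subject to the child's own buffering)
p = subprocess.Popen(['python', 'mix_out.py'],
                     stdout=subprocess.PIPE,
                     stderr=subprocess.STDOUT)
for line in p.stdout:
    print(line.decode(), end='')
p.wait()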
I found a working example here (see the listing of capture_together.py): compiled C++ code that mixes cerr and cout, executed as a subprocess on both Windows and UNIX OSes. The results are identical.
I was able to solve this by using select.select():
import subprocess
import sys
from select import select

process = subprocess.Popen(
    cmd,
    stdout=subprocess.PIPE,
    stderr=subprocess.PIPE,
    close_fds=True,
    **kw
)

while True:
    reads, _, _ = select(
        [process.stdout.fileno(), process.stderr.fileno()],
        [], []
    )

    for descriptor in reads:
        if descriptor == process.stdout.fileno():
            read = process.stdout.readline()
            if read:
                print 'stdout: %s' % read

        if descriptor == process.stderr.fileno():
            read = process.stderr.readline()
            if read:
                print 'stderr: %s' % read
    sys.stdout.flush()

    if process.poll() is not None:
        break
By passing the file descriptors to select() in the reads argument (the first argument to select()) and looping over them (as long as process.poll() indicated that the process was still alive), there was no need for threads.
The code was adapted from this stackoverflow answer.
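On Python 3, the standard selectors module can express the same loop at a slightly higher level; here is a sketch under the same assumptions (POSIX, line-oriented output, same cmd as above):

import selectors
import subprocess

process = subprocess.Popen(cmd, stdout=subprocess.PIPE, stderr=subprocess.PIPE)
sel = selectors.DefaultSelector()
sel.register(process.stdout, selectors.EVENT_READ, 'stdout')
sel.register(process.stderr, selectors.EVENT_READ, 'stderr')
while sel.get_map():  # until both streams hit EOF
    for key, _ in sel.select():
        line = key.fileobj.readline()
        if not line:  # EOF on this stream
            sel.unregister(key.fileobj)
            continue
        print('%s: %s' % (key.data, line.decode()), end='')
process.wait()

The same caveat applies as with raw select(): readline() can still block if the child emits a partial line.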
I'm running a script via Python's subprocess module. Currently I use:
p = subprocess.Popen('/path/to/script', stdout=subprocess.PIPE, stderr=subprocess.PIPE)
result = p.communicate()
I then print the result to stdout. This is all fine, but as the script takes a long time to complete, I want real-time output from the script to stdout as well. The reason I pipe the output is that I want to parse it.
To save the subprocess' stdout to a variable for further processing, and to display it as it arrives while the child process is running:
#!/usr/bin/env python3
from io import StringIO
from subprocess import Popen, PIPE
with Popen('/path/to/script', stdout=PIPE, bufsize=1,
           universal_newlines=True) as p, StringIO() as buf:
    for line in p.stdout:
        print(line, end='')
        buf.write(line)
    output = buf.getvalue()
rc = p.returncode
Saving both the subprocess's stdout and stderr is more complex, because you must consume both streams concurrently to avoid a deadlock:
stdout_buf, stderr_buf = StringIO(), StringIO()
rc = teed_call('/path/to/script', stdout=stdout_buf, stderr=stderr_buf,
               universal_newlines=True)
output = stdout_buf.getvalue()
...
where teed_call() is defined here.
Update: here's a simpler asyncio version.
Old version:
Here's a single-threaded solution based on the child_process.py example from tulip:
import asyncio
import sys
from asyncio.subprocess import PIPE

@asyncio.coroutine
def read_and_display(*cmd):
    """Read cmd's stdout, stderr while displaying them as they arrive."""
    # start process
    process = yield from asyncio.create_subprocess_exec(*cmd,
                                                        stdout=PIPE, stderr=PIPE)

    # read child's stdout/stderr concurrently
    stdout, stderr = [], []  # stdout, stderr buffers
    tasks = {
        asyncio.Task(process.stdout.readline()): (
            stdout, process.stdout, sys.stdout.buffer),
        asyncio.Task(process.stderr.readline()): (
            stderr, process.stderr, sys.stderr.buffer)}
    while tasks:
        done, pending = yield from asyncio.wait(tasks,
                                                return_when=asyncio.FIRST_COMPLETED)
        assert done
        for future in done:
            buf, stream, display = tasks.pop(future)
            line = future.result()
            if line:  # not EOF
                buf.append(line)     # save for later
                display.write(line)  # display in terminal
                # schedule to read the next line
                tasks[asyncio.Task(stream.readline())] = buf, stream, display

    # wait for the process to exit
    rc = yield from process.wait()
    return rc, b''.join(stdout), b''.join(stderr)
The script runs the '/path/to/script' command and reads both its stdout and stderr concurrently, line by line. The lines are printed to the parent's stdout/stderr correspondingly and saved as bytestrings for future processing. To run the read_and_display() coroutine, we need an event loop:
import os

if os.name == 'nt':
    loop = asyncio.ProactorEventLoop()  # for subprocess' pipes on Windows
    asyncio.set_event_loop(loop)
else:
    loop = asyncio.get_event_loop()
try:
    rc, *output = loop.run_until_complete(read_and_display("/path/to/script"))
    if rc:
        sys.exit("child failed with '{}' exit code".format(rc))
finally:
    loop.close()
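For comparison, on Python 3.8+ the same idea can be written with async/await and asyncio.run; this is a sketch of an equivalent, not the version linked above:

import asyncio
import sys

async def read_and_display(*cmd):
    # start the child with both pipes captured
    process = await asyncio.create_subprocess_exec(
        *cmd, stdout=asyncio.subprocess.PIPE, stderr=asyncio.subprocess.PIPE)

    async def tee(stream, display):
        # echo each line as it arrives while keeping a copy
        chunks = []
        async for line in stream:
            chunks.append(line)
            display.write(line)
            display.flush()
        return b''.join(chunks)

    # consume stdout and stderr concurrently to avoid a pipe deadlock
    out, err = await asyncio.gather(
        tee(process.stdout, sys.stdout.buffer),
        tee(process.stderr, sys.stderr.buffer))
    return await process.wait(), out, err

rc, stdout, stderr = asyncio.run(read_and_display('/path/to/script'))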
p.communicate() waits for the subprocess to complete and then returns its entire output at once.
Have you tried something like this instead, where you read the subprocess output line-by-line?
p = subprocess.Popen('/path/to/script', stdout=subprocess.PIPE, stderr=subprocess.PIPE)
for line in p.stdout:
    # do something with this individual line
    print line
The Popen.communicate doc clearly states:
Note: The data read is buffered in memory, so do not use this method if the data size is large or unlimited.
https://docs.python.org/2/library/subprocess.html#subprocess.Popen.communicate
So if you need realtime output, you need to use something like this:
stream_p = subprocess.Popen('/path/to/script', stdout=subprocess.PIPE, stderr=subprocess.PIPE)
for stream_line in stream_p.stdout:
    # Parse it the way you want
    print stream_line
This prints both stdout and stderr to the terminal as well as saving both stdout and stderr into a variable:
from subprocess import Popen, PIPE, STDOUT
with Popen(args, stdout=PIPE, stderr=STDOUT, text=True, bufsize=1) as p:
    output = "".join([print(buf, end="") or buf for buf in p.stdout])
However, depending on what exactly you're doing, this might be important to note: by using stderr=STDOUT we can no longer differentiate between stdout and stderr, and with the call to print your output will always be printed to stdout, no matter whether it came from stdout or stderr.
For Python < 3.7 you will need to use universal_newlines instead of text.
New in version 3.7: text was added as a more readable alias for universal_newlines.
Source: https://docs.python.org/3/library/subprocess.html#subprocess.Popen
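For example, on Python 3.6 the same snippet would only swap the keyword:

with Popen(args, stdout=PIPE, stderr=STDOUT, universal_newlines=True, bufsize=1) as p:
    output = "".join([print(buf, end="") or buf for buf in p.stdout])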
I am trying to manipulate/strip the output of the 7zip command and inform the user about the progress of the process. The sample code I am trying to use is below:
import subprocess
import sys

proc = subprocess.Popen(['7zip', 'arg', 'archive'], shell=True, stdin=None, stdout=subprocess.PIPE, stderr=subprocess.PIPE)
while True:
    line = proc.stdout.readline()
    if line != '':
        # Do some stripping and update the pyqt label
        print "test:", line.rstrip()
        sys.stdout.flush()
    else:
        break
However, the real problem is that the print statement only prints stdout after completion of the process. Is there a way to capture stdout line by line, then manipulate and print it again?
Update
Updated the script to include sys.stdout.flush()
Yes, via the popen family of calls.
You can see the documentation here.
(child_stdin,
 child_stdout,
 child_stderr) = os.popen3("cmd", mode, bufsize)
==>
p = Popen("cmd", shell=True, bufsize=bufsize,
          stdin=PIPE, stdout=PIPE, stderr=PIPE, close_fds=True)
(child_stdin,
 child_stdout,
 child_stderr) = (p.stdin, p.stdout, p.stderr)
They give you file objects for the streams, and you can use them to read the output of the called program.
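For example, a short sketch of reading one of those streams (using the same hypothetical "cmd" as above):

from subprocess import Popen, PIPE

p = Popen("cmd", shell=True, bufsize=1,
          stdin=PIPE, stdout=PIPE, stderr=PIPE, close_fds=True)
# iterate over the child's stdout line by line until EOF
for line in p.stdout:
    print "child output:", line,
p.communicate()  # clean up the remaining pipes and wait for the child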