Redirect subprocess.Popen stderr to console - python

I am executing a make command using subprocess.Popen, but when make fails I do not get the exact error from make and the script just continues to run. How do I get the script to stop and show exactly the output of the make command on the console?
def app(self, build, soc, target):
    command = "make BUILD=%s SOC=%s TARGET=%s" % (build, soc, target)
    subprocess.Popen(command.split(), shell=False,
                     stdout=subprocess.PIPE,
                     stderr=subprocess.PIPE).communicate()

Could you try replacing:
subprocess.Popen(command.split(), shell=False, stdout=subprocess.PIPE, stderr=subprocess.PIPE).communicate()
with:
p = subprocess.Popen(command.split(), shell=False, stdout=subprocess.PIPE, stderr=subprocess.PIPE)
print p.communicate()
print p.returncode
And let us know what the printed output looks like.

If you want the make output to actually go to the console, don't use subprocess.PIPE for stdout/stderr. By default, the called process will use the Python process's stdout/stderr handles. In that case, you can use the subprocess.check_call() function to raise a subprocess.CalledProcessError if the called process returns a non-zero exit code:
subprocess.check_call(command.split())
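Applied to the app() method from the question, a minimal sketch might look like this (nothing else needs to change):
import subprocess

def app(self, build, soc, target):
    command = "make BUILD=%s SOC=%s TARGET=%s" % (build, soc, target)
    # make's output goes straight to the console; a non-zero exit status
    # raises subprocess.CalledProcessError and stops the script
    subprocess.check_call(command.split())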
However, if you need to capture the make output for use in your script, you can use the similar subprocess.check_output() function:
try:
    output = subprocess.check_output(command.split(), stderr=subprocess.STDOUT)
except subprocess.CalledProcessError as e:
    output = e.output
    # error handling here
Note that this combines the stdout and stderr output into a single value. If you need them separately, you would need to use the subprocess.Popen constructor in conjunction with the .communicate() method and manually check the returncode attribute of the Popen object:
p = subprocess.Popen(command.split(), stdout=subprocess.PIPE,
                     stderr=subprocess.PIPE)
out, err = p.communicate()
if p.returncode != 0:
    # raise exception or other error handling here
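One way that error handling could be filled in, for example, is to raise CalledProcessError yourself with the captured stderr attached (just one option among several):
import subprocess

p = subprocess.Popen(command.split(), stdout=subprocess.PIPE,
                     stderr=subprocess.PIPE)
out, err = p.communicate()
if p.returncode != 0:
    # surface make's stderr and stop the script
    raise subprocess.CalledProcessError(p.returncode, command, output=err)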

Related

How do I capture output of git commands with python subprocess.Popen [duplicate]

I am running a subprocess using 'Popen'. I need to block till this subprocess finishes and then read its output.
p = Popen(command, stdin=PIPE, stdout=PIPE, stderr=PIPE, encoding="utf-8")
p.communicate()
output = p.stdout.readline()
print(output)
I get this error:
ValueError: I/O operation on closed file.
How can I read the output after the subprocess finishes? I do not want to use poll(), though, as the subprocess takes time and I would need to wait for its completion anyway.
This should work:
p = Popen(command, stdin=PIPE, stdout=PIPE, stderr=PIPE, encoding="utf-8")
output, error = p.communicate()
print(output)
if error:
    print('error:', error, file=sys.stderr)
However, subprocess.run() is preferred these days:
p = subprocess.run(command, stdout=subprocess.PIPE, stderr=subprocess.PIPE)
print("output:", p.stdout)
if p.stderr:
    print("error:", p.stderr, file=sys.stderr)
Use subprocess.check_output. It returns the output of the command.
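A minimal sketch of that approach, using a git command as in the question's title (the specific git arguments here are only an example):
import subprocess

# blocks until git exits, then returns everything written to stdout;
# raises CalledProcessError if git exits with a non-zero status
output = subprocess.check_output(["git", "status", "--porcelain"])
print(output.decode())  # check_output returns bytes unless an encoding is given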

How to run a non-ending process via subprocess and collect both STDOUT and STDERR separately

I have a Python program which calls subprocess.Popen like this:
process = subprocess.Popen(stand_alone_command, shell=True, stdout=subprocess.PIPE, stderr=subprocess.PIPE)
out, err = process.communicate()
print "out: ", out
print "err: ", err
If my stand_alone_command runs forever, how do I get whatever stand_alone_command is writing to STDOUT and STDERR so that I can log it?
Try reading from stdout instead of calling communicate(), something like this:
import subprocess
sac = ['tail', '-f', '/var/log/syslog']
process = subprocess.Popen(sac, shell=False, stdout=subprocess.PIPE, stderr=subprocess.PIPE)
while 1:
    line = process.stdout.readline()
    if line:
        print(line)
I think you'll need to set shell=False but I'm on Linux and Windows is a bit different.
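Since the question asks for STDOUT and STDERR separately, another common approach is to drain each pipe in its own thread so that neither one fills up and blocks the child. A rough sketch (Python 3, with tail standing in for the real command):
import subprocess
import threading

def pump(pipe, label):
    # read lines until the pipe closes, logging each one with a label
    for line in iter(pipe.readline, ''):
        print(label, line.rstrip())
    pipe.close()

# stand-in for the never-ending command from the question
process = subprocess.Popen(['tail', '-f', '/var/log/syslog'],
                           stdout=subprocess.PIPE, stderr=subprocess.PIPE,
                           universal_newlines=True)

for pipe, label in ((process.stdout, 'out:'), (process.stderr, 'err:')):
    t = threading.Thread(target=pump, args=(pipe, label))
    t.daemon = True
    t.start()

process.wait()  # blocks for as long as the command keeps running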

python subprocess missing arguments

I have been trying to get something like this to work for a while. The code below doesn't seem to be sending the correct arg to the C program arg_count, which reports argc = 1, when I'm pretty sure it should be 2. Running ./arg_count -arg from the shell outputs 2...
I have tried with another arg (so it would output 3 in the shell) and it still outputs 1 when calling via subprocess.
import subprocess
pipe = subprocess.Popen(["./args/Release/arg_count", "-arg"], shell=True, stdout=subprocess.PIPE, stderr=subprocess.PIPE)
out, err = pipe.communicate()
result = out.decode()
print "Result : ",result
print "Error : ",err
Any idea where I'm falling over? I'm running Linux, by the way.
From the documentation:
The shell argument (which defaults to False) specifies whether to use
the shell as the program to execute. If shell is True, it is
recommended to pass args as a string rather than as a sequence.
Thus,
pipe = subprocess.Popen("./args/Release/arg_count -arg", shell=True, stdout=subprocess.PIPE, stderr=subprocess.PIPE)
should give you what you want.
If shell=True then your call is equivalent to:
from subprocess import Popen, PIPE
proc = Popen(['/bin/sh', '-c', "./args/Release/arg_count", "-arg"],
             stdout=PIPE, stderr=PIPE)
i.e., -arg is passed to the shell itself and not your program. Drop shell=True to pass -arg to the program:
proc = Popen(["./args/Release/arg_count", "-arg"],
             stdout=PIPE, stderr=PIPE)
If you don't need to capture stderr separately from stdout then you could use check_output():
from subprocess import check_output, STDOUT
output = check_output(["./args/Release/arg_count", "-arg"])  # or
output_and_errors = check_output(["./args/Release/arg_count", "-arg"],
                                 stderr=STDOUT)

Catching runtime error for process created by python subprocess

I am writing a script which can take a file name as input, compile it and run it.
I am taking the name of a file as input (input_file_name). I first compile the file from within Python:
self.process = subprocess.Popen(['gcc', input_file_name, '-o', 'auto_gen'], stdout=subprocess.PIPE, stdin=subprocess.PIPE, stderr=subprocess.STDOUT, shell=False)
Next, I'm executing the executable using the same (Popen) call:
subprocess.Popen('./auto_gen', stdout=subprocess.PIPE, stdin=subprocess.PIPE, stderr=subprocess.STDOUT, shell=False)
In both cases, I'm catching the stdout (and stderr) contents using
(output, _) = self.process.communicate()
Now, if there is an error during compilation, I am able to catch the error because the returncode is 1 and I can get the details of the error because gcc sends them on stderr.
However, the program itself can return a random value even on executing successfully(because there might not be a "return 0" at the end). So I can't catch runtime errors using the returncode. Moreover, the executable does not send the error details on stderr. So I can't use the trick I used for catching compile-time errors.
What is the best way to catch a runtime error OR to print the details of the error? That is, if ./auto_gen throws a segmentation fault, I should be able to print either one of:
'Runtime error'
'Segmentation Fault'
'Program threw a SIGSEGV'
Try this. The code runs a subprocess which fails and prints to stderr. The except block captures the specific error exit code and stdout/stderr, and displays it.
#!/usr/bin/env python
import subprocess
try:
    out = subprocess.check_output(
        "ls non_existent_file",
        stderr=subprocess.STDOUT,
        shell=True)
    print 'okay:', out
except subprocess.CalledProcessError as exc:
    print 'error: code={}, out="{}"'.format(
        exc.returncode, exc.output,
    )
Example output:
$ python ./subproc.py
error: code=2, out="ls: cannot access non_existent_file: No such file or directory
"
If ./auto_gen is killed by a signal then self.process.returncode (after .wait() or .communicate()) is less than zero and its absolute value reports the signal, e.g., returncode == -11 for SIGSEGV.
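If it helps, here is a hedged sketch of turning that negative returncode into a readable message; the name lookup via the signal module is just one way to do it (on Python 3.8+ signal.strsignal() is simpler):
import signal
import subprocess

process = subprocess.Popen('./auto_gen', stdout=subprocess.PIPE,
                           stderr=subprocess.STDOUT)
output, _ = process.communicate()

if process.returncode < 0:
    signum = -process.returncode
    # map signal numbers back to names (SIGSEGV, SIGABRT, ...),
    # skipping handler constants such as SIG_DFL and SIG_IGN
    names = dict((getattr(signal, name), name) for name in dir(signal)
                 if name.startswith('SIG') and not name.startswith('SIG_'))
    print('Program threw a ' + names.get(signum, 'signal %d' % signum))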
Please check the following link for getting real-time output from a subprocess:
https://www.endpoint.com/blog/2015/01/28/getting-realtime-output-using-python
import shlex
import subprocess

def run_command(command):
    process = subprocess.Popen(shlex.split(command),
                               stdout=subprocess.PIPE)
    while True:
        output = process.stdout.readline()
        if output == '' and process.poll() is not None:
            break
        if output:
            print output.strip()
    rc = process.poll()
    return rc
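Usage is then simply, for example (the command here is only illustrative):
rc = run_command("ping -c 3 localhost")
print "exit code:", rc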

Python: get output from a command line which exits with nonzero exit code

I am using Python 2.7.1 on a Windows Server 2008 R2 x64 box.
I'm trying to get the output of a command line process which gives a nonzero exit status after outputting the information I need.
I was initially using subprocess.check_output and catching the CalledProcessError which occurs with nonzero exit status, but while the returncode was stored in the error, the output I needed was not there.
Running this against cases which give output but have an exit status of 0 works properly and I can get the output using subprocess.check_output.
My assumption was that the output was being written to STDOUT, but the exception pulls its 'output' from STDERR. I've tried to re-implement the functionality of check_output, but I still get nothing on the output when I believe I should be seeing output to STDOUT and STDERR. My current code is below (where 'command' is the full text, including parameters, of the command I am running):
process = subprocess.Popen(command, stdout=subprocess.PIPE,
                           stderr=subprocess.STDOUT, universal_newlines=True)
output = process.communicate()
retcode = process.poll()
if retcode:
    raise subprocess.CalledProcessError(retcode, image_check, output=output)
return output
This gives me the following in the variable output: [('', None)]
Is my subprocess.Popen code correct?
Your code works fine. It turns out that the process you are calling is probably outputting to CON. See the following example:
import subprocess

def check_output(command):
    process = subprocess.Popen(command, shell=True, stdout=subprocess.PIPE, stderr=subprocess.STDOUT, universal_newlines=True)
    output = process.communicate()
    retcode = process.poll()
    if retcode:
        raise subprocess.CalledProcessError(retcode, command, output=output[0])
    return output

command = "echo this>CON"
print "subprocess -> " + subprocess.check_output(command, shell=True)
print "native -> " + str(check_output(command))

try:
    subprocess.check_output("python output.py", shell=True)
except subprocess.CalledProcessError, e:
    print "subproces CalledProcessError.output = " + e.output

try:
    check_output("python output.py")
except subprocess.CalledProcessError, e:
    print "native CalledProcessError.output = " + e.output
Output
subprocess ->
native -> ('', None)
stderr subproces CalledProcessError.output = stdout
native CalledProcessError.output = stderr stdout
Sadly, I do not know how to resolve the issue. Notice that the subprocess.check_output result contains only the output from stdout. Your check_output replacement would output both stderr and stdout.
After inspecting subprocess.check_output, it does indeed generate a CalledProcessError with the output containing only stdout.
Have you tried stderr=subprocess.STDOUT? As mentioned on the Python doc page:
To also capture standard error in the result, use
stderr=subprocess.STDOUT:
Here is some test code:
import subprocess
try:
    subprocess.check_output('>&2 echo "errrrr"; exit 1', shell=True)
except subprocess.CalledProcessError as e:
    print 'e.output: ', e.output

try:
    subprocess.check_output('>&2 echo "errrrr"; exit 1', shell=True, stderr=subprocess.STDOUT)
except subprocess.CalledProcessError as e:
    print 'e.output: ', e.output
output:
errrrr
e.output:
e.output: errrrr
There is an issue here that might be hitting you:
http://bugs.python.org/issue9905
