How to check if subprocess terminated properly? [duplicate] - python

This question already has answers here:
Retrieving the output of subprocess.call() [duplicate]
(7 answers)
Closed 8 years ago.
I want to know whether subprocess.call() has terminated correctly, without any error in the called command. For example, in the code below, if the path provided is not valid, the ls command reports an error such as:
ERROR: No such file or directory.
I want that same output to be stored as a string.
import subprocess
path = raw_input("Enter the path")
subprocess.call(["ls","-l",path])

from subprocess import Popen, PIPE
p = Popen(["ls", "-l", path], stdin=PIPE, stdout=PIPE, stderr=PIPE)
output, err = p.communicate()
status = p.returncode
if status:
    # something went wrong
    pass
else:
    # we are ok
    pass
That said, consider using os.listdir instead.
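A minimal sketch of that os.listdir alternative (the path here is just a placeholder for the user's input): a bad path raises OSError, which carries the same "No such file or directory" detail directly, with no output parsing needed.

```python
import os

# os.listdir raises OSError for a bad path, so the error can be
# caught directly instead of capturing ls output.
path = "."  # placeholder; would come from raw_input in the question
try:
    entries = os.listdir(path)
    print("\n".join(entries))
except OSError as e:
    print("Error:", e)
```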

You cannot do that with call, because what it does is only:
Run the command described by args. Wait for command to complete, then return the returncode attribute.
So you can only determine the return code of a program, which usually means zero if no error occurred, and non-zero otherwise.
Use the check_output method from the same module:
try:
    result = subprocess.check_output(["ls", "-l", path],
                                     stderr=subprocess.STDOUT)
    print result
except subprocess.CalledProcessError as e:
    print "Error:", e.output


Why does subprocess hang when reading from stdout [duplicate]

This question already has answers here:
process.stdout.readline() hangs. How to use it properly?
(3 answers)
Closed 6 years ago.
I have a script called 'my_script.py' with the following contents:
my_input = ''
while my_input != 'quit':
    my_input = raw_input()
    print(my_input)
Then in the console, the following commands:
from subprocess import *
p1 = Popen(['python', 'my_script.py'], stdin=PIPE)
p1.stdin.write('some words\n')
prints "some words", but if instead I write
from subprocess import *
p2 = Popen(['python', 'my_script.py'], stdin=PIPE, stdout=PIPE)
p2.stdin.write('some words\n')
p2.stdout.readline()
the shell will hang and I have to terminate it manually. How can I get this to work if I want to be able to access the stdout of the script? I'm using python2.7
Edit: To clarify my question, the above snippet will run properly for other executables that have an I/O loop (the particular one I'm working with is the stockfish chess engine). Is there a way I can modify my_script.py such that the above snippet will run properly? Using
Popen(['python3', 'my_script.py'], ...)
will work, but is it not possible using Python 2.7?
It can happen due to a deadlock when reading with readline().
You need to use the communicate() method, which reads both streams to the end without deadlocking.
Example:
import subprocess

def run_shell_command(cmd, params):
    cmdline = [cmd] + params.split(' ')
    p = subprocess.Popen(cmdline, stdout=subprocess.PIPE, stderr=subprocess.PIPE)
    stdout, stderr = p.communicate()
    if p.returncode != 0:
        raise RuntimeError("%r failed, status code %s stdout %r stderr %r" % (
            cmd, p.returncode, stdout, stderr))
    return stdout.strip()  # This is the stdout from the shell command
To execute a command in the background you can use the following example:
def run_shell_remote_command_background(cmd, params):
    cmdline = [cmd] + params.split(' ')
    subprocess.Popen(cmdline)  # not waiting on the result leaves it running in the background
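As a side note on the original hang: under Python 2 the child's print output is block-buffered when its stdout is a pipe, so readline() in the parent can wait forever. A sketch of one workaround, running the child unbuffered with -u (the inline echo child here is a stand-in for my_script.py):

```python
import subprocess
import sys

# Run an echo child unbuffered (-u) so the parent's readline() sees
# each line immediately instead of waiting on a filled buffer.
child = ("import sys\n"
         "for line in sys.stdin:\n"
         "    sys.stdout.write(line)\n"
         "    sys.stdout.flush()\n")
p = subprocess.Popen([sys.executable, "-u", "-c", child],
                     stdin=subprocess.PIPE, stdout=subprocess.PIPE)
p.stdin.write(b"some words\n")
p.stdin.flush()              # flush the parent's side of the pipe too
line = p.stdout.readline()   # b'some words\n'
p.stdin.close()
p.wait()
print(line)
```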

How to execute and save result of an OS command to a file [duplicate]

This question already has answers here:
How to redirect output with subprocess in Python?
(6 answers)
Closed 7 years ago.
In python 2.7, I would like to execute an OS command (for example 'ls -l' in UNIX) and save its output to a file. I don't want the execution results to show anywhere else other than the file.
Is this achievable without using os.system?
Use subprocess.check_call redirecting stdout to a file object:
from subprocess import check_call, STDOUT, CalledProcessError
with open("out.txt", "w") as f:
    try:
        check_call(['ls', '-l'], stdout=f, stderr=STDOUT)
    except CalledProcessError as e:
        print(e)
Whatever you want to do when the command returns a non-zero exit status should be handled in the except. If you want one file for stdout and another for stderr, open two files:
from subprocess import check_call, STDOUT, CalledProcessError
with open("stdout.txt", "w") as f, open("stderr.txt", "w") as f2:
    try:
        check_call(['ls', '-l'], stdout=f, stderr=f2)
    except CalledProcessError as e:
        print(e)
Assuming you just want to run a command have its output go into a file, you could use the subprocess module like
subprocess.call( "ls -l > /tmp/output", shell=True )
though note that this will not redirect stderr.
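If stderr should end up in the same file, the usual shell redirection can be appended (a sketch following the same example path):

```python
import subprocess

# 2>&1 folds stderr into stdout before the shell redirects the
# combined stream into the file.
subprocess.call("ls -l /nonexistent > /tmp/output 2>&1", shell=True)
with open("/tmp/output") as f:
    content = f.read()
print(content)
```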
You can open a file and pass it to subprocess.call as the stdout parameter and the output destined for stdout will go to the file instead.
import subprocess
with open("result.txt", "w") as f:
    subprocess.call(["ls", "-l"], stdout=f)
It won't catch any output to stderr, though; that would have to be redirected by passing a file to subprocess.call as the stderr parameter. I'm not certain whether you can use the same file.
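For what it's worth, a quick sketch suggests the same file object can indeed be reused for both streams:

```python
import subprocess

# Passing the same file object for both stdout and stderr works:
# both child descriptors then refer to the same open file.
with open("result.txt", "w") as f:
    subprocess.call(["ls", "-l", "."], stdout=f, stderr=f)
with open("result.txt") as f:
    combined = f.read()
print(combined)
```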

Call in python return code status only

I have the following code in python:
i = call(["salt-cloud", "-m", fileName, "--assume-yes"])
print (i)
i is always 0, because salt-cloud itself exits without an error code once the operation finishes.
The problem is that I want to get the output of this operation. In our case:
window:
----------
Error:
----------
Not Deployed:
Failed to start Salt on host window
is the output of running salt-cloud -m fileName --assume-yes, and it is an indication that error raised in this process, and I want to know it.
How can I achieve it in Python?
Use check_output and catch a CalledProcessError which will be raised for any non-zero exit status:
from subprocess import check_output, CalledProcessError
try:
    out = check_output(["salt-cloud", "-m", fileName, "--assume-yes"])
    print(out)
except CalledProcessError as e:
    print(e.output)
If you are using Python < 2.7 you will need Popen; check_output was only added in Python 2.7:
from subprocess import Popen, PIPE
p = Popen(["salt-cloud", "-m", filename, "--assume-yes"], stderr=PIPE, stdout=PIPE)
out, err = p.communicate()
print( err if err else out)
Assuming you're using subprocess.call, you might want to take a look at subprocess.check_output(), which returns the output of the subprocess as a byte string.

Catching runtime error for process created by python subprocess

I am writing a script which can take a file name as input, compile it and run it.
I am taking the name of a file as input(input_file_name). I first compile the file from within python:
self.process = subprocess.Popen(['gcc', input_file_name, '-o', 'auto_gen'], stdout=subprocess.PIPE, stdin=subprocess.PIPE, stderr=subprocess.STDOUT, shell=False)
Next, I'm executing the executable using the same(Popen) call:
subprocess.Popen('./auto_gen', stdout=subprocess.PIPE, stdin=subprocess.PIPE, stderr=subprocess.STDOUT, shell=False)
In both cases, I'm catching the stdout(and stderr) contents using
(output, _) = self.process.communicate()
Now, if there is an error during compilation, I am able to catch the error because the returncode is 1 and I can get the details of the error because gcc sends them on stderr.
However, the program itself can return a random value even on executing successfully(because there might not be a "return 0" at the end). So I can't catch runtime errors using the returncode. Moreover, the executable does not send the error details on stderr. So I can't use the trick I used for catching compile-time errors.
What is the best way to catch a runtime error OR to print the details of the error? That is, if ./auto_gen throws a segmentation fault, I should be able to print either one of:
'Runtime error'
'Segmentation Fault'
'Program threw a SIGSEGV'
Try this. The code runs a subprocess which fails and prints to stderr. The except block captures the specific error exit code and stdout/stderr, and displays it.
#!/usr/bin/env python
import subprocess
try:
    out = subprocess.check_output(
        "ls non_existent_file",
        stderr=subprocess.STDOUT,
        shell=True)
    print 'okay:', out
except subprocess.CalledProcessError as exc:
    print 'error: code={}, out="{}"'.format(
        exc.returncode, exc.output,
    )
Example output:
$ python ./subproc.py
error: code=2, out="ls: cannot access non_existent_file: No such file or directory
"
If ./auto_gen is killed by a signal then self.process.returncode (after .wait() or .communicate()) is less than zero and its absolute value reports the signal, e.g., returncode == -11 for SIGSEGV.
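A short sketch of that check, using a stand-in child that raises SIGSEGV on itself (POSIX only; signal.Signals requires Python 3.5+):

```python
import signal
import subprocess
import sys

# The child kills itself with SIGSEGV; the parent maps the negative
# return code back to the signal name.
code = "import os, signal; os.kill(os.getpid(), signal.SIGSEGV)"
p = subprocess.Popen([sys.executable, "-c", code])
p.wait()
name = signal.Signals(-p.returncode).name if p.returncode < 0 else None
print("Program threw a", name)
```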
Please check the following link for getting real-time output from a subprocess:
https://www.endpoint.com/blog/2015/01/28/getting-realtime-output-using-python
import shlex
import subprocess

def run_command(command):
    process = subprocess.Popen(shlex.split(command),
                               stdout=subprocess.PIPE)
    while True:
        output = process.stdout.readline()
        if output == '' and process.poll() is not None:
            break
        if output:
            print output.strip()
    rc = process.poll()
    return rc

Retrieving the output of subprocess.call() [duplicate]

This question already has answers here:
Store output of subprocess.Popen call in a string [duplicate]
(15 answers)
Closed 4 years ago.
How can I get the output of a process run using subprocess.call()?
Passing a StringIO.StringIO object to stdout gives this error:
Traceback (most recent call last):
File "<stdin>", line 1, in <module>
File "/Library/Frameworks/Python.framework/Versions/2.6/lib/python2.6/subprocess.py", line 444, in call
return Popen(*popenargs, **kwargs).wait()
File "/Library/Frameworks/Python.framework/Versions/2.6/lib/python2.6/subprocess.py", line 588, in __init__
errread, errwrite) = self._get_handles(stdin, stdout, stderr)
File "/Library/Frameworks/Python.framework/Versions/2.6/lib/python2.6/subprocess.py", line 945, in _get_handles
c2pwrite = stdout.fileno()
AttributeError: StringIO instance has no attribute 'fileno'
If you have Python version >= 2.7, you can use subprocess.check_output which basically does exactly what you want (it returns standard output as string).
Simple example (linux version, see note):
import subprocess
print subprocess.check_output(["ping", "-c", "1", "8.8.8.8"])
Note that the ping command is using linux notation (-c for count). If you try this on Windows remember to change it to -n for same result.
As commented below you can find a more detailed explanation in this other answer.
Output from subprocess.call() should only be redirected to files.
You should use subprocess.Popen() instead. Then you can pass subprocess.PIPE for the stderr, stdout, and/or stdin parameters and read from the pipes by using the communicate() method:
from subprocess import Popen, PIPE
p = Popen(['program', 'arg1'], stdin=PIPE, stdout=PIPE, stderr=PIPE)
output, err = p.communicate(b"input data that is passed to subprocess' stdin")
rc = p.returncode
The reasoning is that the file-like object used by subprocess.call() must have a real file descriptor, and thus implement the fileno() method. Just using any file-like object won't do the trick.
See here for more info.
For Python 3.5+ it is recommended that you use the run function from the subprocess module. This returns a CompletedProcess object, from which you can easily obtain the output as well as the return code.
from subprocess import PIPE, run
command = ['echo', 'hello']
result = run(command, stdout=PIPE, stderr=PIPE, universal_newlines=True)
print(result.returncode, result.stdout, result.stderr)
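Relatedly, run() accepts a check=True flag that raises CalledProcessError on failure, combining the convenience of check_output with full access to the result; a small sketch:

```python
from subprocess import run, PIPE, CalledProcessError

# check=True makes run() raise CalledProcessError on a non-zero exit
# status, while the captured output stays available on the exception.
failed = False
try:
    run(["ls", "no_such_file_here"], stdout=PIPE, stderr=PIPE,
        universal_newlines=True, check=True)
except CalledProcessError as e:
    failed = True
    print("failed with code", e.returncode)
```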
I have the following solution. It captures the exit code, the stdout, and the stderr too of the executed external command:
import shlex
from subprocess import Popen, PIPE
def get_exitcode_stdout_stderr(cmd):
    """
    Execute the external command and get its exitcode, stdout and stderr.
    """
    args = shlex.split(cmd)
    proc = Popen(args, stdout=PIPE, stderr=PIPE)
    out, err = proc.communicate()
    exitcode = proc.returncode
    return exitcode, out, err
cmd = "..." # arbitrary external command, e.g. "python mytest.py"
exitcode, out, err = get_exitcode_stdout_stderr(cmd)
I also have a blog post on it here.
Edit: the solution was updated to a newer one that doesn't need to write to temp. files.
I recently just figured out how to do this, and here's some example code from a current project of mine:
#Getting the random picture.
#First find all pictures:
import shlex, subprocess
cmd = 'find ../Pictures/ -regex ".*\(JPG\|NEF\|jpg\)" '
#cmd = raw_input("shell:")
args = shlex.split(cmd)
output,error = subprocess.Popen(args,stdout = subprocess.PIPE, stderr= subprocess.PIPE).communicate()
#Another way to get output
#output = subprocess.Popen(args,stdout = subprocess.PIPE).stdout
ber = raw_input("search complete, display results?")
print output
#... and on to the selection process ...
You now have the output of the command stored in the variable "output". stdout=subprocess.PIPE tells Popen to create a pipe and expose it as the file object p.stdout. The communicate() method is a convenient way to send input, wait for the process to finish, and return a tuple of (stdout, stderr). Also note that the process starts running as soon as Popen is instantiated.
The key is to use the function subprocess.check_output
For example, the following function captures stdout and stderr of the process and returns that as well as whether or not the call succeeded. It is Python 2 and 3 compatible:
from subprocess import check_output, CalledProcessError, STDOUT
def system_call(command):
    """
    params:
        command: list of strings, ex. `["ls", "-l"]`
    returns: output, success
    """
    try:
        output = check_output(command, stderr=STDOUT).decode()
        success = True
    except CalledProcessError as e:
        output = e.output.decode()
        success = False
    return output, success
output, success = system_call(["ls", "-l"])
If you want to pass commands as strings rather than arrays, use this version:
from subprocess import check_output, CalledProcessError, STDOUT
import shlex
def system_call(command):
    """
    params:
        command: string, ex. `"ls -l"`
    returns: output, success
    """
    command = shlex.split(command)
    try:
        output = check_output(command, stderr=STDOUT).decode()
        success = True
    except CalledProcessError as e:
        output = e.output.decode()
        success = False
    return output, success
output, success = system_call("ls -l")
In Ipython shell:
In [8]: import subprocess
In [9]: s=subprocess.check_output(["echo", "Hello World!"])
In [10]: s
Out[10]: 'Hello World!\n'
Based on sargue's answer. Credit to sargue.
