Take a simple command in bash
cmd='ls -l | wc -l'
I understand that we can run this command in several ways using subprocess call/check_output/communicate. The problem arises for me when the initial command does not work or fails for some reason, e.g. if I replace ls with lsx:
cmd='lsx -l | wc -l'
In this case, how can we capture the error, or do we just have to process the output to figure it out? Here is what I tried.
import subprocess
>>> subprocess.call('lsx -l | wc -l', shell=True)
/bin/sh: lsx: command not found
0
0
>>> subprocess.check_output('lsx -l | wc -l', shell=True)
/bin/sh: lsx: command not found
b' 0\n'
It seems that the exit code is still 0 in both of the calls above.
I also tried the approach from https://docs.python.org/3.5/library/subprocess.html#replacing-shell-pipeline but cannot figure out how to get the exit code of the first process.
If you specify bash instead of sh, you can set the pipefail option, which makes the pipeline return a nonzero exit status if any part of it fails:
subprocess.check_output(['bash', '-c', 'set -o pipefail; lsx -l | wc -l'])
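With pipefail in place, a failing first stage surfaces as a CalledProcessError instead of being swallowed. A minimal sketch, assuming bash is available (run_pipeline is just an illustrative helper name):

```python
import subprocess

def run_pipeline(pipeline):
    """Run a shell pipeline under bash with pipefail enabled.

    check_output raises CalledProcessError if any stage of the
    pipeline exits nonzero, not just the last one.
    """
    return subprocess.check_output(
        ['bash', '-c', 'set -o pipefail; ' + pipeline],
        stderr=subprocess.STDOUT,  # fold "command not found" into the output
    )

try:
    print(run_pipeline('lsx -l | wc -l'))
except subprocess.CalledProcessError as exc:
    print('pipeline failed with exit status', exc.returncode)
```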
That said, you can certainly avoid using shell=True altogether:
import subprocess
try:
    p1 = subprocess.Popen(['ls', '--invalid-argument'], stdout=subprocess.PIPE)
    p2 = subprocess.Popen(['wc', '-l'], stdin=p1.stdout, stdout=subprocess.PIPE)
    wc_stdout = p2.communicate()[0]
    if p1.wait() != 0 or p2.wait() != 0:
        raise RuntimeError("Something failed!")
except FileNotFoundError as ex:
    raise RuntimeError("Something failed, because we couldn't find an executable!") from ex
You can capture standard error by passing stderr=PIPE. Consider this example:
>>> from subprocess import PIPE, Popen
>>> sub = Popen('lsx -l | wc -l', shell=True, stderr=PIPE, stdout=PIPE)
>>> output, error_output = sub.communicate()
>>> error_output
b'/bin/sh: 1: lsx: not found\n'
>>> output
b'0\n'
>>> sub = Popen('ls -l | wc -l', shell=True, stderr=PIPE, stdout=PIPE)
>>> output, error_output = sub.communicate()
>>> error_output
b''
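A caveat worth knowing when checking the result after communicate(): the shell reports the exit status of the last command in the pipeline, so returncode stays 0 even when the first command failed, and only stderr reveals the problem. A small sketch:

```python
from subprocess import PIPE, Popen

sub = Popen('lsx -l | wc -l', shell=True, stderr=PIPE, stdout=PIPE)
output, error_output = sub.communicate()
# returncode is the status of wc (the last command in the pipe),
# so it is 0 here even though lsx could not be found
if sub.returncode != 0 or error_output:
    print('pipeline reported a problem:', error_output.decode().strip())
```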
I have code so far that filters out everything except the gateway IP (route -n | awk '{if($4=="UG")print $2}'), but I'm trying to figure out how to get its output into a Python variable. Here's what I've got:
import shlex
from subprocess import Popen, PIPE

cmd = "route -n | grep 'UG[ \t]' | awk '{print $2}'"
gateway = Popen(shlex.split(cmd), stdout=PIPE)
gateway.communicate()
exit_code = gateway.wait()
Any ideas?
NOTE: I'm new at this.
For better or worse, your cmd uses a shell pipeline. To use shell features in subprocess, one must set shell=True:
from subprocess import Popen, PIPE
cmd = "/sbin/route -n | grep 'UG[ \t]' | awk '{print $2}'"
gateway = Popen(cmd, shell=True, stdout=PIPE)
stdout, stderr = gateway.communicate()
exit_code = gateway.wait()
Alternatively, one could keep shell=False, eliminate the pipeline, and do all the string processing in Python:
from subprocess import Popen, PIPE
cmd = "/sbin/route -n"
gateway = Popen(cmd.split(), stdout=PIPE)
stdout, stderr = gateway.communicate()
exit_code = gateway.wait()
gw = [line.split()[1] for line in stdout.decode().split('\n') if 'UG' in line][0]
Because of the vagaries of shell processing, and unless there is a specific need, it is probably best to avoid shell=True.
I have the following:
cmd = "ps aux | grep 'java -jar' | grep -v grep | awk '//{print $2}'".split(' ')
p = subprocess.Popen(cmd, stdout=subprocess.PIPE, stderr=subprocess.PIPE)
out, err = p.communicate()
print out
When I run the command in the console (outside of Python), I get the desired output. Running the above code in Python prints a blank line. I assume it's something to do with the cmd (specifically the | operator), but I can't be sure.
I need to achieve this with the standard Python 2.6.6 install (no additional modules)
You need to use a single call to Popen() for each piece of the original command, as connected by the pipe, as in
import subprocess
p1 = subprocess.Popen(["ps", "aux"], stdout=subprocess.PIPE, stderr=subprocess.PIPE)
p2 = subprocess.Popen(["grep", "java -jar"], stdin=p1.stdout, stdout=subprocess.PIPE, stderr=subprocess.PIPE)
p3 = subprocess.Popen(["grep", "-v", "grep"], stdin=p2.stdout, stdout=subprocess.PIPE, stderr=subprocess.PIPE)
p4 = subprocess.Popen(["awk", "//{print $2}"], stdin=p3.stdout, stdout=subprocess.PIPE, stderr=subprocess.PIPE)
out, err = p4.communicate()
print out
The subprocess documentation has an in-depth discussion.
Popen by default only executes executables, not shell command lines.
When you pass a list of arguments to Popen, the list should describe a single executable and its arguments:
import subprocess
proc = subprocess.Popen(['ps', 'aux'])
Also note that you should not use str.split to split a command, because:
>>> "ps aux | grep 'java -jar' | grep -v grep | awk '//{print $2}'".split(' ')
['ps', 'aux', '|', 'grep', "'java", "-jar'", '', '', '', '', '|', 'grep', '-v', 'grep', '|', 'awk', "'//{print", "$2}'"]
Note how:
The arguments that were quoted (e.g. 'java -jar') are split apart.
If there is more than one consecutive space you get some empty arguments.
Python already provides a module that knows how to split a command line in a reasonable manner, it's shlex:
>>> shlex.split("ps aux | grep 'java -jar' | grep -v grep | awk '//{print $2}'")
['ps', 'aux', '|', 'grep', 'java -jar', '|', 'grep', '-v', 'grep', '|', 'awk', '//{print $2}']
Note how quoted arguments were preserved, and multiple spaces are handled gracefully. Still you cannot pass the result to Popen, because Popen will not interpret the | as a pipe by default.
If you want to run a shell command line (i.e. use any shell feature such as pipes, path expansion, redirection, etc.) you must pass shell=True. In this case you should not pass a list of strings to Popen, but a single string containing the complete command line:
proc = subprocess.Popen("ps aux | grep 'java -jar' | grep -v grep | awk '//{print $2}'", shell=True)
If you pass a list of strings with shell=True, its meaning is different: the first element should be the complete command line, while the remaining elements are passed as additional arguments to the shell. For example, on my machine the default shell (sh) has a -x option that displays on stderr every command that gets executed:
>>> from subprocess import Popen
>>> proc = Popen(['ps aux | grep python3', '-x'], shell=True)
>>>
username 7301 0.1 0.1 39440 7408 pts/9 S+ 12:57 0:00 python3
username 7302 0.0 0.0 4444 640 pts/9 S+ 12:58 0:00 /bin/sh -c ps aux | grep python3 -x
username 7304 0.0 0.0 15968 904 pts/9 S+ 12:58 0:00 grep python3
Here you can see that a /bin/sh was started to execute the command ps aux | grep python3, with -x passed along to the shell.
(This is all documented in the documentation for Popen).
This said, one way to achieve what you want is to use subprocess.check_output:
subprocess.check_output("ps aux | grep 'java -jar' | grep -v grep | awk '//{print $2}'", shell=True)
However, check_output isn't available in Python < 2.7, so there you have to use Popen and communicate():
proc = subprocess.Popen("ps aux | grep 'java -jar' | grep -v grep | awk '//{print $2}'", shell=True, stdout=subprocess.PIPE)
out, err = proc.communicate()
The alternative is to avoid shell=True (generally a good idea, since passing commands through a shell introduces security risks) and build the pipeline manually with multiple processes:
from subprocess import Popen, PIPE
ps = Popen(['ps', 'aux'], stdout=PIPE)
grep_java = Popen(['grep', 'java -jar'], stdin=ps.stdout, stdout=PIPE)
grep_grep = Popen(['grep', '-v', 'grep'], stdin=grep_java.stdout, stdout=PIPE)
awk = Popen(['awk', '//{print $2}'], stdin=grep_grep.stdout, stdout=PIPE)
out, err = awk.communicate()
grep_grep.wait()
grep_java.wait()
ps.wait()
Note that if you don't care for the standard error you can avoid specifying it. It will then inherit the one of the current process.
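One refinement over the plain chain above, following the pattern in the subprocess docs: after wiring each stage, close the parent's copy of the upstream stdout so the upstream process receives SIGPIPE if the downstream one exits early. A sketch with a two-stage pipeline:

```python
from subprocess import PIPE, Popen

ps = Popen(['ps', 'aux'], stdout=PIPE)
grep = Popen(['grep', 'ps'], stdin=ps.stdout, stdout=PIPE)
ps.stdout.close()  # let ps receive SIGPIPE if grep exits before ps finishes
out, err = grep.communicate()
ps.wait()
```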
Only one command in the shell pipeline (ps aux) can't easily be replaced by Python code. So you could start several external processes and connect their inputs and outputs, but I would just start ps aux and add some Python code to filter and extract the desired data:
from subprocess import PIPE, Popen
def main():
    process = Popen(['ps', 'aux'], stdout=PIPE)
    pids = [
        line.split(None, 2)[1] for line in process.stdout if 'java -jar' in line
    ]
    process.wait()
    print '\n'.join(pids)

if __name__ == '__main__':
    main()
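On Python 3 the same approach needs the bytes decoded and print called as a function. A sketch (java_jar_pids is just an illustrative name):

```python
from subprocess import PIPE, Popen

def java_jar_pids():
    """Return pids of 'ps aux' lines that mention 'java -jar' (Python 3)."""
    process = Popen(['ps', 'aux'], stdout=PIPE)
    out, _ = process.communicate()
    return [
        line.split(None, 2)[1]
        for line in out.decode().splitlines()
        if 'java -jar' in line
    ]

print('\n'.join(java_jar_pids()))
```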
I want to open a process and run two commands in the same process. I have :
cmd1 = 'source /usr/local/../..'
cmd2 = 'ls -l'
final = Popen(cmd2, shell=True, stdin=PIPE, stdout=PIPE, stderr=STDOUT, close_fds=True)
stdout, nothing = final.communicate()
log = open('log', 'w')
log.write(stdout)
log.close()
If I use Popen twice, the two commands are executed in different processes, but I want them to run in the same shell.
The commands will always be two (unix) processes, but you can start them from one call to Popen and the same shell by using:
from subprocess import Popen, PIPE, STDOUT
cmd1 = 'echo "hello world"'
cmd2 = 'ls -l'
final = Popen("{}; {}".format(cmd1, cmd2), shell=True, stdin=PIPE,
stdout=PIPE, stderr=STDOUT, close_fds=True)
stdout, nothing = final.communicate()
log = open('log', 'w')
log.write(stdout)
log.close()
After running the program the file 'log' contains:
hello world
total 4
-rw-rw-r-- 1 anthon users 303 2012-05-15 09:44 test.py
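If the second command should only run when the first succeeds, join them with && instead of ; — a small sketch of the same pattern:

```python
from subprocess import Popen, PIPE, STDOUT

cmd1 = 'echo "hello world"'
cmd2 = 'ls -l'
# '&&' stops the sequence if cmd1 fails; ';' would run cmd2 regardless
final = Popen('{} && {}'.format(cmd1, cmd2), shell=True,
              stdout=PIPE, stderr=STDOUT)
stdout, _ = final.communicate()
```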
Why does the subprocess pid (Popen.pid) have a different value from the one the ps command returns?
I've noticed this when ps is called both from inside Python (with subprocess.call()) and from another terminal.
Here's a simple python file to test:
#!/usr/bin/python3
'''
Test subprocess termination
'''
import subprocess
command = 'cat'
#keep pipes so that cat doesn't complain
proc = subprocess.Popen(command,
                        stdout=subprocess.PIPE,
                        stderr=subprocess.PIPE,
                        stdin=subprocess.PIPE,
                        shell=True)
print('pid = %d' % proc.pid)
subprocess.call("ps -A | grep -w %s" % command,
                shell=True)
proc.terminate()
proc.wait()  # make sure it's dead before exiting python
Usually the pid reported by ps is 1 or 2 more than that reported by Popen.pid.
Because the command is run with shell=True, the pid returned by Popen is that of the shell process used to run the command; the actual cat process is a child of that shell, which is why ps shows a higher pid.