awk from Python: wrong subprocess arguments?

I need to run the following (working) command in Python:
ip route list dev eth0 | awk ' /^default/ {print $3}'
Using subprocess, I would have to do the following:
first = "ip route list dev eth0"
second = "awk ' /^default/ {print $3}'"
p1 = subprocess.Popen(first.split(), stdout=subprocess.PIPE)
p2 = subprocess.Popen(second.split(), stdin=p1.stdout, stdout=subprocess.PIPE)
p1.stdout.close() # Allow p1 to receive a SIGPIPE if p2 exits.
output = p2.communicate()[0]
Something went wrong with p2. I get:
>>> awk: cmd. line:1: '
awk: cmd. line:1: ^ invalid char ''' in expression
What should I do? On a terminal it works perfectly.

str.split splits on any whitespace, including the whitespace inside your single-quoted awk program, so awk receives the pieces of the program as separate arguments. If you really have to start from a single string, use shlex.split:
import shlex
p2 = subprocess.Popen(shlex.split(second), stdin=p1.stdout, stdout=subprocess.PIPE)
However it usually makes more sense to specify the commands directly:
first = ['ip', 'route', 'list', 'dev', 'eth0']
second = ['awk', ' /^default/ {print $3}']
p1 = subprocess.Popen(first, stdout=subprocess.PIPE)
p2 = subprocess.Popen(second, stdin=p1.stdout, stdout=subprocess.PIPE)
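As an aside (not part of the original answer): if all you want is the default gateway, you could drop awk entirely and do the filtering in Python. A minimal sketch, assuming Python 2 as in the rest of this thread and that ip route list dev eth0 prints a line starting with "default via <gateway> ...":
p = subprocess.Popen(['ip', 'route', 'list', 'dev', 'eth0'], stdout=subprocess.PIPE)
out = p.communicate()[0]
gateway = None
for line in out.splitlines():
    fields = line.split()
    if fields and fields[0] == 'default':
        gateway = fields[2]  # the same field awk's $3 would print
        break
print gateway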

Not the best solution, but while you wait for a better answer, you can still do this:
cmd = "ip route list dev eth0 | awk ' /^default/ {print $3}'"
p2 = subprocess.Popen(cmd, stdin=subprocess.PIPE, stdout=subprocess.PIPE, shell=True)
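To actually read the result from that version (a short sketch; remember that shell=True hands the whole string to /bin/sh, so only do this with a trusted, hard-coded command):
output = p2.communicate()[0]
print output.strip()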

Related

How can I combine three subprocess.popen into one line?

I am currently using
p1 = subprocess.Popen(['ps', 'ax'], stdout=subprocess.PIPE)
p2 = subprocess.Popen(['grep', 'bash'], stdin=p1.stdout, stdout=subprocess.PIPE)
p3 = subprocess.Popen(['wc', '-l'], stdin=p2.stdout, stdout=subprocess.PIPE)
count = int(p3.stdout.read())
if count == 2:
    print count
    print "yes"
else:
    print "fail"
I use this to check whether my Python script is already running.
It works fine.
However, I want to know if I can combine p1, p2 and p3 into one line.
I tried
p = subprocess.Popen(['ps ax | grep bash | wc -l'], stdout=subprocess.PIPE)
and a few more, but it's not working. Is there a way to combine all three of these into one line?
You can only use shell-style pipes (|) if you pass the shell=True keyword argument to Popen:
p = subprocess.Popen('ps ax | grep bash | wc -l', stdout=subprocess.PIPE, shell=True)
Otherwise, the entire string you pass is treated like the name of the executable, which will fail, since you don't have a program named ps ax | grep bash | wc -l.
Additionally, with shell=True you shouldn't pass the command as a list; just pass a string.
One other note: You probably need to adjust your command for this to work reliably, so that the grep bash command itself isn't counted in the output of ps ax | grep bash. One trick to do this is to use ps ax | grep [b]ash. Also, using shell=True might end up starting a bash instance to run the command, so you might need to use if count == 3 instead of if count == 2.
Putting it all together:
p = subprocess.Popen('ps ax | grep [b]ash | wc -l', stdout=subprocess.PIPE, shell=True)
count = int(p.stdout.read())
if count == 2: # Or maybe 3?
    print count
    print "yes"
else:
    print "fail"
Edit:
Here's the output from running the different versions of the code on my machine. Each version was run several times and gave the same result every time:
>>> p1 = subprocess.Popen(['ps', 'ax'], stdout=subprocess.PIPE)
>>> p2 = subprocess.Popen(['grep', 'bash'], stdin=p1.stdout, stdout=subprocess.PIPE)
>>> p3 = subprocess.Popen(['wc', '-l'], stdin=p2.stdout, stdout=subprocess.PIPE); p3.stdout.read()
'42\n'
>>> p = subprocess.Popen('ps -ef | grep bash | wc -l ', stdout=subprocess.PIPE, shell=True) ; print(''.join(p.stdout.readlines()))
44
>>> p = subprocess.Popen('ps -ef | grep [b]ash | wc -l ', stdout=subprocess.PIPE, shell=True) ; print(''.join(p.stdout.readlines()))
42
Here is the obvious one-liner: all I did was replace the references to p1 and p2 with the Popen calls themselves, nested inline.
p3 = subprocess.Popen(['wc', '-l'],
                      stdin=subprocess.Popen(['grep', 'bash'],
                                             stdin=subprocess.Popen(['ps', 'ax'],
                                                                    stdout=subprocess.PIPE).stdout,
                                             stdout=subprocess.PIPE).stdout,
                      stdout=subprocess.PIPE)
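As a further aside (not from the original answers): if the goal is just to count processes by name, pgrep sidesteps the grep-matching-itself problem entirely. A sketch, assuming pgrep is installed and that "bash" really is the process name you want to count:
p = subprocess.Popen(['pgrep', '-c', 'bash'], stdout=subprocess.PIPE)
count = int(p.communicate()[0])
if count == 2:
    print "yes"
else:
    print "fail"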

Multiple line shell commands using Python commands module

I'm trying to write a Python function that transforms a given coordinate system to another using gdal. The problem is that I'm trying to execute the command as one string, but in the shell I have to press Enter before entering the coordinates.
x = 1815421
y = 557301
ret = []
tmp = commands.getoutput('gdaltransform -s_srs \'+proj=lcc +lat_1=34.03333333333333 '
                         '+lat_2=35.46666666666667 +lat_0=33.5 +lon_0=-118 +x_0=2000000 +y_0=500000 +ellps=GRS80 '
                         '+units=m +no_defs\' -t_srs epsg:4326 \n' + str(x) + ' ' + str(y))
I tried it using '\n', but that doesn't work.
My guess is that you run gdaltransform by pressing Enter and the coordinates are read by the program itself from its stdin, not the shell:
from subprocess import Popen, PIPE
p = Popen(['gdaltransform', '-s_srs', ('+proj=lcc '
                                       '+lat_1=34.03333333333333 '
                                       '+lat_2=35.46666666666667 '
                                       '+lat_0=33.5 '
                                       '+lon_0=-118 +x_0=2000000 +y_0=500000 +ellps=GRS80 '
                                       '+units=m +no_defs'), '-t_srs', 'epsg:4326'],
          stdin=PIPE, stdout=PIPE, universal_newlines=True)  # run the program
output = p.communicate("%s %s\n" % (x, y))[0]  # pass coordinates
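A possible usage sketch for the snippet above, assuming gdaltransform writes one "x y z" line per coordinate pair (check your gdal version's actual output format):
x_out, y_out, z_out = output.split()
print x_out, y_out, z_out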
from subprocess import *
c = 'command 1 && command 2 && command 3'
# for instance: c = 'dir && cd C:\\ && dir'
handle = Popen(c, stdin=PIPE, stderr=PIPE, stdout=PIPE, shell=True)
print handle.stdout.read()
handle.wait()
If I'm not mistaken, the commands will be executed in one shell "session", so whatever information you need is kept between the commands.
More precisely, shell=True (from what I've been taught) is meant to be used when you pass a single command string rather than a list. If you'd rather use a list, do it like this:
import shlex
c = shlex.split("program -w ith -a 'quoted argument'")
handle = Popen(c, stdout=PIPE, stderr=PIPE, stdin=PIPE)
print handle.stdout.read()
Then catch the output. Or you could work with an open stream and use handle.stdin.write(), but that's a bit trickier.
Unless you only want to execute, read the output, and exit, .communicate() is perfect, or just .check_output(<cmd>).
Good information on how Popen works can be found here (although it's a different topic): python subprocess stdin.write a string error 22 invalid argument
Solution
Anyway, this should work (you have to redirect STDIN and STDOUT):
from subprocess import *
c = 'gdaltransform -s_srs \'+proj=lcc +lat_1=34.03333333333333 +lat_2=35.46666666666667 +lat_0=33.5 +lon_0=-118 +x_0=2000000 +y_0=500000 +ellps=GRS80 +units=m +no_defs\' -t_srs epsg:4326 \n' + str(x) + ' ' + str(y) + '\n'
handle = Popen(c, stdin=PIPE, stderr=PIPE, stdout=PIPE, shell=True)
print handle.stdout.read()
handle.wait()

Python subprocess: how to use pipes thrice? [duplicate]

(Marked as a duplicate of: How do I use subprocess.Popen to connect multiple processes by pipes?)
I'd like to use subprocess on the following line:
convert ../loxie-orig.png bmp:- | mkbitmap -f 2 -s 2 -t 0.48 | potrace -t 5 --progress -s -o ../DSC00232.svg
Thanks to other posts I found the subprocess documentation, but the example there only pipes two commands.
So I tried it with two of the three commands, and it works:
p1 = subprocess.Popen(['convert', fileIn, 'bmp:-'], stdout=subprocess.PIPE)
# p2 = subprocess.Popen(['mkbitmap', '-f', '2', '-s', '2', '-t', '0.48'], stdout=subprocess.PIPE)
p3 = subprocess.Popen(['potrace', '-t' , '5', '-s' , '-o', fileOut], stdin=p1.stdout,stdout=subprocess.PIPE)
p1.stdout.close() # Allow p1 to receive a SIGPIPE if p3 exits.
output = p3.communicate()[0]
Can you help me with the third command?
Thank you very much.
Just add a third command following the same example:
p1 = subprocess.Popen(['convert', fileIn, 'bmp:-'], stdout=subprocess.PIPE)
p2 = subprocess.Popen(['mkbitmap', '-f', '2', '-s', '2', '-t', '0.48'],
                      stdin=p1.stdout, stdout=subprocess.PIPE)
p1.stdout.close()
p3 = subprocess.Popen(['potrace', '-t', '5', '-s', '-o', fileOut],
                      stdin=p2.stdout, stdout=subprocess.PIPE)
p2.stdout.close()
output = p3.communicate()[0]
def runPipe(cmds):
    try:
        p1 = subprocess.Popen(cmds[0].split(' '), stdin=None, stdout=subprocess.PIPE, stderr=subprocess.PIPE)
        prev = p1
        for cmd in cmds[1:]:
            p = subprocess.Popen(cmd.split(' '), stdin=prev.stdout, stdout=subprocess.PIPE, stderr=subprocess.PIPE)
            prev = p
        stdout, stderr = p.communicate()
        p.wait()
        returncode = p.returncode
    except Exception, e:
        stderr = str(e)
        returncode = -1
    if returncode == 0:
        return (True, stdout.strip().split('\n'))
    else:
        return (False, stderr)
Then execute it like:
runPipe(['ls -1','head -n 2', 'head -n 1'])
Use subprocess.Popen() with the option shell=True, and you can pass it your entire command as a single string.
This is the simplest solution and makes it possible to embed a complicated pipeline in Python without head-scratching; but in some cases it might not work, e.g. (as @torek commented) if there are spaces in the filenames passed for input or output. In that case, take the trouble to build up the robust solution in the accepted answer.
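If you do take the shell=True route and the filenames might contain spaces or other shell metacharacters, you can quote them when building the command string. A sketch assuming Python 3's shlex.quote (on Python 2 the rough equivalent is pipes.quote); the filenames here are hypothetical:
import shlex
import subprocess
file_in = '../loxie orig.png'   # note the space
file_out = '../DSC00232.svg'
cmd = 'convert %s bmp:- | mkbitmap -f 2 -s 2 -t 0.48 | potrace -t 5 -s -o %s' % (
    shlex.quote(file_in), shlex.quote(file_out))
output = subprocess.check_output(cmd, shell=True)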

link several Popen commands with pipes

I know how to run a command using cmd = subprocess.Popen and then subprocess.communicate.
Most of the time I use a string tokenized with shlex.split as 'argv' argument for Popen.
Example with "ls -l":
import subprocess
import shlex
print subprocess.Popen(shlex.split(r'ls -l'), stdin = subprocess.PIPE, stdout = subprocess.PIPE, stderr = subprocess.PIPE).communicate()[0]
However, pipes seem not to work... For instance, the following example returns nothing:
import subprocess
import shlex
print subprocess.Popen(shlex.split(r'ls -l | sed "s/a/b/g"'), stdin = subprocess.PIPE, stdout = subprocess.PIPE, stderr = subprocess.PIPE).communicate()[0]
Can you tell me what I am doing wrong please?
Thx
I think you want to instantiate two separate Popen objects here, one for 'ls' and the other for 'sed'. You'll want to pass the first Popen object's stdout attribute as the stdin argument to the 2nd Popen object.
Example:
p1 = subprocess.Popen('ls ...', stdout=subprocess.PIPE)
p2 = subprocess.Popen('sed ...', stdin=p1.stdout, stdout=subprocess.PIPE)
print p2.communicate()
You can keep chaining this way if you have more commands:
p3 = subprocess.Popen('prog', stdin=p2.stdout, ...)
See the subprocess documentation for more info on how to work with subprocesses.
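For the concrete command from the question, that chain would look something like this (a sketch following the same pattern):
p1 = subprocess.Popen(['ls', '-l'], stdout=subprocess.PIPE)
p2 = subprocess.Popen(['sed', 's/a/b/g'], stdin=p1.stdout, stdout=subprocess.PIPE)
p1.stdout.close()  # let p1 receive SIGPIPE if p2 exits first
print p2.communicate()[0]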
I've made a little function to help with the piping, hope it helps. It will chain Popens as needed.
from subprocess import Popen, PIPE
import shlex
def run(cmd):
    """Runs the given command locally and returns the output, err and exit_code."""
    if "|" in cmd:
        cmd_parts = cmd.split('|')
    else:
        cmd_parts = []
        cmd_parts.append(cmd)
    i = 0
    p = {}
    for cmd_part in cmd_parts:
        cmd_part = cmd_part.strip()
        if i == 0:
            p[i] = Popen(shlex.split(cmd_part), stdin=None, stdout=PIPE, stderr=PIPE)
        else:
            p[i] = Popen(shlex.split(cmd_part), stdin=p[i-1].stdout, stdout=PIPE, stderr=PIPE)
        i = i + 1
    (output, err) = p[i-1].communicate()
    exit_code = p[0].wait()
    return str(output), str(err), exit_code

output, err, exit_code = run("ls -lha /var/log | grep syslog | grep gz")
if exit_code != 0:
    print "Output:"
    print output
    print "Error:"
    print err
    # Handle error here
else:
    # Be happy :D
    print output
shlex only splits the string into tokens according to shell quoting rules; it does not handle pipes.
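You can see that for yourself by looking at what shlex.split produces for the piped command line; the '|' just becomes an ordinary argument passed to ls:
>>> import shlex
>>> shlex.split(r'ls -l | sed "s/a/b/g"')
['ls', '-l', '|', 'sed', 's/a/b/g']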
It should, however, work this way:
import subprocess
import shlex
sp_ls = subprocess.Popen(shlex.split(r'ls -l'), stdin = subprocess.PIPE, stdout = subprocess.PIPE, stderr = subprocess.PIPE)
sp_sed = subprocess.Popen(shlex.split(r'sed "s/a/b/g"'), stdin = sp_ls.stdout, stdout = subprocess.PIPE, stderr = subprocess.PIPE)
sp_ls.stdin.close() # makes it similar to /dev/null
output = sp_sed.communicate()[0] # read sed's output; any errors are ignored here
print output
According to help(subprocess):
Replacing shell pipe line
-------------------------
output=`dmesg | grep hda`
==>
p1 = Popen(["dmesg"], stdout=PIPE)
p2 = Popen(["grep", "hda"], stdin=p1.stdout, stdout=PIPE)
output = p2.communicate()[0]
HTH
"""
Why don't you use shell
"""
def output_shell(line):
try:
shell_command = Popen(line, stdout=PIPE, stderr=PIPE, shell=True)
except OSError:
return None
except ValueError:
return None
(output, err) = shell_command.communicate()
shell_command.wait()
if shell_command.returncode != 0:
print "Shell command failed to execute"
return None
return str(output)
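For example (a hypothetical call, not from the original answer), the piped command from the question could then be run as:
print output_shell('ls -l | sed "s/a/b/g"')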
Thanks to @hernvnc, @glglgl, and @Jacques Gaudin for the answers. I fixed the code from @hernvnc; his version hangs in some scenarios.
import shlex
from subprocess import PIPE
from subprocess import Popen
def run(cmd, input=None):
    """Runs the given command locally and returns the output, err and exit_code."""
    if "|" in cmd:
        cmd_parts = cmd.split('|')
    else:
        cmd_parts = []
        cmd_parts.append(cmd)
    i = 0
    p = {}
    for cmd_part in cmd_parts:
        cmd_part = cmd_part.strip()
        if i == 0:
            if input:
                p[i] = Popen(shlex.split(cmd_part), stdin=PIPE, stdout=PIPE, stderr=PIPE)
            else:
                p[i] = Popen(shlex.split(cmd_part), stdin=None, stdout=PIPE, stderr=PIPE)
        else:
            p[i] = Popen(shlex.split(cmd_part), stdin=p[i-1].stdout, stdout=PIPE, stderr=PIPE)
        i = i + 1
    # Close the stdin explicitly, otherwise the following case will hang.
    if input:
        p[0].stdin.write(input)
        p[0].stdin.close()
    (output, err) = p[i-1].communicate()
    exit_code = p[0].wait()
    return str(output), str(err), exit_code
# test case below
inp = b'[ CMServer State ]\n\nnode node_ip instance state\n--------------------------------------------\n1 linux172 10.90.56.172 1 Primary\n2 linux173 10.90.56.173 2 Standby\n3 linux174 10.90.56.174 3 Standby\n\n[ ETCD State ]\n\nnode node_ip instance state\n--------------------------------------------------\n1 linux172 10.90.56.172 7001 StateFollower\n2 linux173 10.90.56.173 7002 StateLeader\n3 linux174 10.90.56.174 7003 StateFollower\n\n[ Cluster State ]\n\ncluster_state : Normal\nredistributing : No\nbalanced : No\ncurrent_az : AZ_ALL\n\n[ Datanode State ]\n\nnode node_ip instance state | node node_ip instance state | node node_ip instance state\n------------------------------------------------------------------------------------------------------------------------------------------------------------------------\n1 linux172 10.90.56.172 6001 P Standby Normal | 2 linux173 10.90.56.173 6002 S Primary Normal | 3 linux174 10.90.56.174 6003 S Standby Normal'
cmd = "grep -E 'Primary' | tail -1 | awk '{print $3}'"
run(cmd, input=inp)

How to get output from external command combine with Pipe

I have a command like this:
wmctrl -lp | awk '/gedit/ { print $1 }'
And I want its output within a Python script. I tried this code:
>>> import subprocess
>>> proc = subprocess.Popen(["wmctrl -lp", "|","awk '/gedit/ {print $1}"], shell=True, stdout=subprocess.PIPE, stderr=subprocess.PIPE)
>>> proc.stdout.readline()
'0x0160001b -1 6504 beer-laptop x-nautilus-desktop\n'
>>> proc.stdout.readline()
'0x0352f117 0 6963 beer-laptop How to get output from external command combine with Pipe - Stack Overflow - Chromium\n'
>>> proc.stdout.readline()
'0x01400003 -1 6503 beer-laptop Bottom Expanded Edge Panel\n'
>>>
It seems my code is wrong: only wmctrl -lp was executed, and | awk '/gedit/ {print $1}' was omitted.
My expected output would be something like 0x03800081:
$ wmctrl -lp | awk '/gedit/ {print $1}'
0x03800081
Can someone please help?
With shell=True, you should use a single command line instead of an array, otherwise your additional arguments are interpreted as shell arguments. From the subprocess documentation:
On Unix, with shell=True: If args is a string, it specifies the command string to execute through the shell. If args is a sequence, the first item specifies the command string, and any additional items will be treated as additional shell arguments.
So your call should be:
subprocess.Popen("wmctrl -lp | sed /gedit/ '{print $1}'", shell=True, ...
I think you may also have an unbalanced single quote in there.
Because you are passing a sequence in for the program, it thinks that the pipe is an argument to wmctrl, as if you did
wmctrl -lp "|"
and thus the actual pipe operation is lost.
Making it a single string should indeed give you the correct result:
>>> import subprocess as s
>>> proc = s.Popen("echo hello | grep e", shell=True, stdout=s.PIPE, stderr=s.PIPE)
>>> proc.stdout.readline()
'hello\n'
>>> proc.stdout.readline()
''
After some research, I have the following code which works very well for me. It basically prints both stdout and stderr in real time. Hope it helps someone else who needs it.
import sys
import threading
import subprocess

stdout_result = 1
stderr_result = 1

def stdout_thread(pipe):
    global stdout_result
    while True:
        out = pipe.stdout.read(1)
        stdout_result = pipe.poll()
        if out == '' and stdout_result is not None:
            break
        if out != '':
            sys.stdout.write(out)
            sys.stdout.flush()

def stderr_thread(pipe):
    global stderr_result
    while True:
        err = pipe.stderr.read(1)
        stderr_result = pipe.poll()
        if err == '' and stderr_result is not None:
            break
        if err != '':
            sys.stdout.write(err)
            sys.stdout.flush()

def exec_command(command, cwd=None):
    if cwd is not None:
        print '[' + ' '.join(command) + '] in ' + cwd
    else:
        print '[' + ' '.join(command) + ']'
    p = subprocess.Popen(
        command, stdout=subprocess.PIPE, stderr=subprocess.PIPE, cwd=cwd
    )
    out_thread = threading.Thread(name='stdout_thread', target=stdout_thread, args=(p,))
    err_thread = threading.Thread(name='stderr_thread', target=stderr_thread, args=(p,))
    err_thread.start()
    out_thread.start()
    out_thread.join()
    err_thread.join()
    return stdout_result + stderr_result
When needed, it's easy to collect the output or error into a string and return it instead.
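For instance, a hypothetical variant (exec_command_capture is my name, not from the answer above) that collects everything via communicate() instead of streaming it:
def exec_command_capture(command, cwd=None):
    """Like exec_command, but returns (stdout, stderr, returncode) instead of streaming."""
    p = subprocess.Popen(command, stdout=subprocess.PIPE, stderr=subprocess.PIPE, cwd=cwd)
    out, err = p.communicate()
    return out, err, p.returncode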
