Processing output from cmdline via a Python script

I'm trying to use the subprocess module with Python 2.6 in order to run a command and get its output. The command is typically run like this:
/usr/local/sbin/kamctl fifo profile_get_size myprofile | awk -F ':: ' '{print $2}'
What's the best way to use the subprocess module in my script to execute that command with those arguments and get the return value from the command?

Do you want the output, the return value (AKA status code), or both?
If the amount of data emitted by the pipeline on stdout and/or stderr is not too large, it's pretty simple to get "all of the above":
import subprocess
s = """/usr/local/sbin/kamctl fifo profile_get_size myprofile | awk -F ':: ' '{print $2}'"""
p = subprocess.Popen(s, shell=True, stdout=subprocess.PIPE, stderr=subprocess.PIPE)
out, err = p.communicate()  # without stderr=subprocess.PIPE, err would just be None
print 'out: %r' % out
print 'err: %r' % err
print 'status: %r' % p.returncode
If you have to deal with potentially huge amounts of output, it takes a bit more code -- doesn't look like you should have that problem, judging from the pipeline in question.
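For completeness, here is a minimal sketch of that "bit more code" (Python 2.6 compatible): stream the pipeline's output line by line instead of buffering it all in memory. The process_line function is a hypothetical stand-in for your own per-line handling:
import subprocess

def process_line(line):
    print 'line: %r' % line  # stand-in for real per-line handling

s = """/usr/local/sbin/kamctl fifo profile_get_size myprofile | awk -F ':: ' '{print $2}'"""
p = subprocess.Popen(s, shell=True, stdout=subprocess.PIPE)
for line in iter(p.stdout.readline, ''):  # '' signals EOF on the pipe
    process_line(line)
p.stdout.close()
print 'status: %r' % p.wait()  # exit status of the shell running the pipeline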

For example, you can get stdout like this:
>>> import subprocess
>>> process = subprocess.Popen("echo 'test'", shell=True, stdout=subprocess.PIPE)
>>> process.wait()
0
>>> process.stdout.read()
'test\n'
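Note that calling wait() before read() can deadlock once the command produces more output than the pipe buffer holds. A safer variant lets communicate() do the reading and the waiting:
>>> import subprocess
>>> process = subprocess.Popen("echo 'test'", shell=True, stdout=subprocess.PIPE)
>>> out, _ = process.communicate()  # reads all output, then waits
>>> out
'test\n'
>>> process.returncode
0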

Related

Running a complex command line in Python

I would like to call a complex command line in Python and capture its output, and I don't understand how I should be doing it:
The command line that I'm trying to run is:
cat codegen_query_output.json | jq -r '.[0].code' | echoprint-inverted-query index.bin
The farthest I've gotten is:
process = subprocess.Popen(['ls', '-a'], stdout=subprocess.PIPE)
out, err = process.communicate()
print out
but this is a simple ls -a ([cmd, args]); any idea how I should run/structure my complex command line call?
The cleanest way is to create 2 subprocesses piped together. You don't need a subprocess for the cat command, just pass an opened file handle:
import subprocess
with open("codegen_query_output.json") as input_stream:
    jqp = subprocess.Popen(["jq", "-r", ".[0].code"],
                           stdin=input_stream, stdout=subprocess.PIPE)
    ep = subprocess.Popen(["echoprint-inverted-query", "index.bin"],
                          stdin=jqp.stdout, stdout=subprocess.PIPE)
    jqp.stdout.close()  # so ep's stdin is the only handle on jq's stdout
    output = ep.stdout.read()
    jq_status, ep_status = jqp.wait(), ep.wait()  # wait on both stages
    return_code = jq_status or ep_status
The jqp process takes the file contents as input; its output is fed to ep's input.
In the end we read from ep to get the final result. The return_code combines both statuses: it is non-zero if something went wrong in either stage (for more detailed return code info, test each status separately, as shown below).
Standard error isn't captured here; it will be displayed on the console unless stderr=subprocess.STDOUT is set (to merge it with the piped output).
This method doesn't require a shell or shell=True, which makes it more portable and secure.
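For example, the two statuses captured above can be checked one at a time to see which stage failed:
if jq_status != 0:
    print("jq failed with status %d" % jq_status)
if ep_status != 0:
    print("echoprint-inverted-query failed with status %d" % ep_status)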
It takes a shell to interpret operators like |. You can ask Python to run a shell, and pass your command as the thing to execute:
cmd = "cat test.py | tail -n3"
process = subprocess.Popen(['bash', '-c', cmd], stdout=subprocess.PIPE)
out, err = process.communicate()
print out

Can't use Python subprocess to get return value from curl call

So I am trying to write a simple wrapper in Python to call rasa, an NLU tool. The command I would write on the command line is this:
curl -X POST "localhost:5000/parse" -d '{"q":"I am looking for fucking Mexican food"}' | python -m json.tool
The output I expect is something like this:
  % Total    % Received % Xferd  Average Speed   Time    Time     Time  Current
                                 Dload  Upload   Total   Spent    Left  Speed
100   545    0   500  100    45  33615   3025 --:--:-- --:--:-- --:--:-- 35714
plus the pretty-printed JSON.
I wrote this program in Python:
import subprocess
utterance = "Lets say something"
result = subprocess.run(["curl", "-X", "POST", "localhost:5000/parse", "-d", "'{\"q\":\""+utterance+"\"}'", "|", "python", "-m", "json.tool"], stdout=subprocess.PIPE)
print(vars(result))
print(result.stdout.decode('utf-8'))
Unfortunately my output is like this, meaning I don't actually get the return from the curl call:
{'args': ['curl', '-X', 'POST', 'localhost:5000/parse', '-d', '\'{"q":"Lets say something"}\'', '|', 'python', '-m', 'json.tool'], 'returncode': 2, 'stdout': b'', 'stderr': None}
If I call my Python program from the command line, this is the output:
curl: option -m: expected a proper numerical parameter
curl: try 'curl --help' or 'curl --manual' for more information
{'args': ['curl', '-X', 'POST', 'localhost:5000/parse', '-d', '\'{"q":"Lets say something"}\'', '|', 'python', '-m', 'json.tool'], 'returncode': 2, 'stdout': b'', 'stderr': None}
I tried looking everywhere but just can't get it going. Would really appreciate some help.
Update: I grossly misunderstood the question the first time. Rushed reading the details, so my apologies there. You are having a problem because you are trying to pipe two commands together using Popen. The pipe operator, however, is something implemented by the shell, not Python. So everything after the pipe, including |, python, -m, and json.tool, is passed to curl as ordinary arguments, which is why curl complains about -m. It is complicated, but you have options.
I think for your particular example, the simplest is to not try to chain the command to json.tool. You actually have no need for it. You are already in Python, so you can just pretty-print the output you get from curl yourself. Using Python would look something like
import json
import shlex
from subprocess import Popen, PIPE

command = 'curl -XGET http://localhost:9200'
p = Popen(shlex.split(command), stdin=PIPE, stdout=PIPE, stderr=PIPE)
output, err = p.communicate()
if p.returncode != 0:
    print(err)
j = json.loads(output.decode("utf-8"))
print(json.dumps(j, indent=4, sort_keys=True))
However, if what you want long term is to actually connect multiple processes with pipes, it depends on the scenario. The easiest method is to pass shell=True to Popen and pass the exact command string (not a list of arguments). This delegates everything to the shell. I need to warn you that this is very exploitable when the command is built from user input. Both 2.x pipes.quote() and 3.x shlex.quote() can escape such input so that the command is safe to run.
from subprocess import Popen, PIPE

command = 'curl -XGET http://localhost:9200 | python -m json.tool'
p = Popen(command, stdin=PIPE, stdout=PIPE, stderr=PIPE, shell=True)
output, err = p.communicate()
if p.returncode != 0:
    print(err)
print(output.decode("utf-8"))
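If the utterance from the question has to end up inside a shell string like this, quote it first. A minimal sketch, assuming Python 3 (shlex.quote) and the same local rasa endpoint, with json.dumps building the body safely:
import json
import shlex
from subprocess import Popen, PIPE

utterance = "Lets say something"          # imagine this came from a user
payload = json.dumps({"q": utterance})    # build the JSON body safely
command = "curl -X POST localhost:5000/parse -d %s | python -m json.tool" % shlex.quote(payload)
p = Popen(command, stdout=PIPE, stderr=PIPE, shell=True)
output, err = p.communicate()
if p.returncode != 0:
    print(err)
print(output.decode("utf-8"))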
So if you find yourself needing to connect processes but have something based on user input, you can use multiple processes and connect them yourself.
import shlex
from subprocess import Popen, PIPE

command1 = 'curl -XGET http://localhost:9200'
command2 = 'python -m json.tool'
p1 = Popen(shlex.split(command1), stdout=PIPE)
p2 = Popen(shlex.split(command2), stdin=p1.stdout, stdout=PIPE)
p1.stdout.close()  # let p1 receive SIGPIPE if p2 exits early
output, err = p2.communicate()
if p2.returncode != 0:
    print(err)
print(output.decode("utf-8"))
This question has a bunch more on the topic if you are curious.

How to avoid passing shell constructs to executable using Popen

I am trying to call an executable called foo, and pass it some command line arguments. An external script calls into the executable and uses the following command:
./main/foo --config config_file 2>&1 | /usr/bin/tee temp.log
The script uses Popen to execute this command as follows:
from subprocess import Popen
from subprocess import PIPE

def run_command(command, returnObject=False):
    cmd = command.split(' ')
    print('%s' % cmd)
    p = None
    print('command : %s' % command)
    if returnObject:
        p = Popen(cmd)
    else:
        p = Popen(cmd)
        p.communicate()
        print('returncode: %s' % p.returncode)
        return p.returncode
    return p

command = "./main/foo --config config_file 2>&1 | /usr/bin/tee temp.log"
run_command(command)
However, this passes extra arguments ['2>&1', '|', '/usr/bin/tee', 'temp.log'] to the foo executable.
How can I get rid of these extra arguments getting passed to foo while maintaining the functionality?
I have tried shell=True but read about avoiding it for security purposes (shell injection attack). Looking for a neat solution.
Thanks
UPDATE:
- Updated the file following the tee command
The string
./main/foo --config config_file 2>&1 | /usr/bin/tee >temp.log
...is full of shell constructs. These have no meaning to anything without a shell in play. Thus, you have two options:
Set shell=True
Replace them with native Python code.
For instance, 2>&1 is the same thing as passing stderr=subprocess.STDOUT to Popen, and your tee -- since its output is redirected and it's passed no arguments -- could just be replaced with stdout=open('temp.log', 'w').
Thus:
p = subprocess.Popen(['./main/foo', '--config', 'config_file'],
                     stderr=subprocess.STDOUT,
                     stdout=open('temp.log', 'w'))
...or, if you really did want the tee command, but were just using it incorrectly (that is, if you wanted tee temp.log, not tee >temp.log):
p1 = subprocess.Popen(['./main/foo', '--config', 'config_file'],
                      stderr=subprocess.STDOUT,
                      stdout=subprocess.PIPE)
p2 = subprocess.Popen(['tee', 'temp.log'], stdin=p1.stdout)
p1.stdout.close()  # drop our own handle so p2's stdin is the only handle on p1.stdout
stdout, _ = p2.communicate()
Wrapping this in a function, and checking success for both ends might look like:
def run():
    p1 = subprocess.Popen(['./main/foo', '--config', 'config_file'],
                          stderr=subprocess.STDOUT,
                          stdout=subprocess.PIPE)
    p2 = subprocess.Popen(['tee', 'temp.log'], stdin=p1.stdout)
    p1.stdout.close()  # drop our own handle so p2's stdin is the only handle on p1.stdout
    # True if both processes were successful, False otherwise
    return p2.wait() == 0 and p1.wait() == 0
By the way -- if you want to use shell=True and return the exit status of foo, rather than tee, things get a bit more interesting. Consider the following:
p = subprocess.Popen(['bash', '-c', 'set -o pipefail; ' + command_str])
...the pipefail bash extension will force the shell to exit with the status of the first pipeline component to fail (and 0 if no components fail), rather than using only the exit status of the final component.
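For concreteness, a minimal sketch of that approach, reusing the command from the question and assuming bash is on the PATH:
import subprocess

command_str = "./main/foo --config config_file 2>&1 | /usr/bin/tee temp.log"
p = subprocess.Popen(['bash', '-c', 'set -o pipefail; ' + command_str])
p.wait()
print('pipeline status: %d' % p.returncode)  # non-zero if any stage failed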
Here are a couple of "neat" code examples in addition to the explanation from @Charles Duffy's answer.
To run the shell command in Python:
#!/usr/bin/env python
from subprocess import check_call
check_call("./main/foo --config config_file 2>&1 | /usr/bin/tee temp.log",
           shell=True)
without the shell:
#!/usr/bin/env python
from subprocess import Popen, PIPE, STDOUT
tee = Popen(["/usr/bin/tee", "temp.log"], stdin=PIPE)
foo = Popen("./main/foo --config config_file".split(),
            stdout=tee.stdin, stderr=STDOUT)
pipestatus = [foo.wait(), tee.wait()]
Note: don't use "command arg".split() with non-literal strings.
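If the command string isn't a literal, shlex.split() tokenizes it the way a POSIX shell would; a quick illustration:
import shlex

cmd = "./main/foo --config 'config file with spaces'"
print(shlex.split(cmd))  # ['./main/foo', '--config', 'config file with spaces']
print(cmd.split())       # naive split breaks the quoted argument apart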
See How do I use subprocess.Popen to connect multiple processes by pipes?
You may combine answers to two StackOverflow questions:
1. piping together several subprocesses (the x | y problem)
2. merging a Python script's subprocess' stdout and stderr while keeping them distinguishable (the 2>&1 problem)

Python: test if a client exists

I am new to Python and am working on a script which checks whether a specified host, for example sensu-client, exists. I use deployment software called NSO; running nso status shows me this information:
nagios-client host nagios-client down
test host test down
Is there any way to check with a script whether, for example, nagios-client exists?
In shell I do it by:
nso status | awk '{ print $1 }'
In this case I would suggest using subprocess's check_output function (see the subprocess documentation). check_output returns, as a string, the output of a command. Since your command relies on a shell pipe, pass it as a single string with shell=True:
import subprocess
foo = subprocess.check_output("nso status | awk '{ print $1 }'", shell=True)
print foo
Of course, if you're only targeting Linux, you could use the much easier sh module. It allows you to call programs as if they were Python functions.
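For instance, a sketch assuming the third-party sh package is installed (pip install sh); passing one command's output as the first argument of another pipes them together:
import sh

# equivalent of: nso status | awk '{ print $1 }'
first_column = sh.awk(sh.nso("status"), "{ print $1 }")
print(first_column)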
You can use subprocess to run this command and parse the output; the pipe means it has to go through a shell:
import subprocess
command = "nso status | awk '{ print $1 }'"
p1 = subprocess.Popen(command, shell=True, stdout=subprocess.PIPE)
You don't have to run awk, since you're already in Python:
import subprocess
proc = subprocess.Popen(['nso', 'status'], stdout=subprocess.PIPE)
# get stdout as one newline-separated string, ignore stderr for now
out, _ = proc.communicate()
# parse the output; line.split()[0] is awk's $1 (skip blank lines)
items = [line.split()[0] for line in out.splitlines() if line.strip()]
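With items in hand, the original question ("does nagios-client exist?") becomes a plain membership test:
if 'nagios-client' in items:
    print 'nagios-client exists'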

running bash command from python shell

I want to run a bash command from the Python shell.
My bash command is:
grep -Po "(?<=<cite>).*?(?=</cite>)" /tmp/file1.txt | awk -F/ '{print $1}' | awk '!x[$0]++' > /tmp/file2.txt
what I tried is:
#!/usr/bin/python
import commands
commands.getoutput('grep ' + '-Po ' + '\"\(?<=<dev>\).*?\(?=</dev>\)\" ' + '/tmp/file.txt ' + '| ' + 'awk \'!x[$0]++\' ' + '> ' + '/tmp/file2.txt')
But I don't have any result.
Thank you
If you want to avoid splitting your arguments and worrying about pipes, you can use the shell=True option:
import subprocess
cmd = "grep -Po \"(?<=<dev>).*?(?=</dev>)\" /tmp/file.txt | awk -F/ '{print $1}' | awk '!x[$0]++' > file2.txt"
out = subprocess.check_output(cmd, shell=True)
This will run a subshell which understands all your directives, including | for piping and > for redirection. If you do not do this, these symbols, normally parsed by the shell, will just be passed to the grep program as arguments.
Otherwise, you have to create the pipes yourself. For example (untested code below):
grep_p = subprocess.Popen(["grep", "-Po", "(?<=<dev>).*?(?=</dev>)", "/tmp/file.txt"],
                          stdout=subprocess.PIPE)
awk_p = subprocess.Popen(["awk", "-F/", "{print $1}"],
                         stdin=grep_p.stdout, stdout=subprocess.PIPE)
file2_fh = open("file2.txt", "w")
awk_p_2 = subprocess.Popen(["awk", "!x[$0]++"], stdin=awk_p.stdout, stdout=file2_fh)
awk_p_2.communicate()
However, you're missing the point of Python if you are doing this. You should instead look into the re module (re.match, re.sub, re.search), which can replace the grep and awk steps entirely; see the sketch below.
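For reference, a sketch of the question's pipeline in pure Python with re, assuming the goal is: extract the text between <cite> tags, keep the part before the first /, and drop duplicates while preserving order:
import re

seen = set()
with open('/tmp/file1.txt') as src:
    matches = re.findall(r'(?<=<cite>).*?(?=</cite>)', src.read())
with open('/tmp/file2.txt', 'w') as dst:
    for match in matches:
        first_field = match.split('/')[0]  # awk -F/ '{print $1}'
        if first_field not in seen:        # awk '!x[$0]++'
            seen.add(first_field)
            dst.write(first_field + '\n')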
The recommended way to run system commands in Python is to use the subprocess module.
import subprocess
# note: without a shell, the patterns and awk programs must not carry extra quotes
a = ['grep', '-Po', '(?<=<dev>).*?(?=</dev>)', '/tmp/file.txt']
b = ['awk', '-F/', '{print $1}']
c = ['awk', '!x[$0]++']
p1 = subprocess.Popen(a, stdout=subprocess.PIPE)
p2 = subprocess.Popen(b, stdin=p1.stdout, stdout=subprocess.PIPE)
p3 = subprocess.Popen(c, stdin=p2.stdout, stdout=subprocess.PIPE)
p1.stdout.close()
p2.stdout.close()
out, err = p3.communicate()
print out
Creating pipes between the subprocesses yourself is better for security and debugging. It also makes the code much clearer about which process gets input from, and sends output to, which other process.
Let us write a simple function to easily deal with these messy pipes for us:
def subprocess_pipes(pipes, last_pipe_out=None):
    import subprocess
    from subprocess import PIPE
    last_p = None
    for cmd in pipes:
        out_pipe = PIPE if not (cmd == pipes[-1] and last_pipe_out) else open(last_pipe_out, "w")
        cmd = cmd if isinstance(cmd, list) else cmd.split(" ")
        in_pipe = last_p.stdout if last_p else None
        p = subprocess.Popen(cmd, stdout=out_pipe, stdin=in_pipe)
        last_p = p
    comm = last_p.communicate()
    return comm
Then we run,
subprocess_pipes(("ps ax", "grep python"), last_pipe_out = "test.out.2")
The result is a "test.out.2" file with the contents of piping "ps ax" into "grep python".
In your case,
a = ["grep", "-Po", "(?<=<cite>).*?(?=</cite>)", "/tmp/file1.txt"]
b = ["awk", "-F/", "{print $1}"]
c = ["awk", "!x[$0]++"]
subprocess_pipes((a, b, c), last_pipe_out = "/tmp/file2.txt")
The commands module is obsolete now.
If you don't actually need the output of your command you can use
import os
exit_status = os.system("your-command")
Otherwise you can use
import subprocess
out, err = subprocess.Popen("your | commands", stdout=subprocess.PIPE, stderr=subprocess.PIPE, shell = True).communicate()
Note: your command sends stdout to file2.txt, so I wouldn't expect to see anything in out. You will, however, still see error messages on stderr, which will go into err.
Alternatively, you can use
import os
os.system(command)
I think what you are looking for is something like:
subprocess.check_output(same-as-Popen arguments, **kwargs); use it the same way you would use a Popen command, and it returns the output of the program that's being called.
For more details here is a link: http://freefilesdl.com/how-to-call-a-shell-command-from-python/
