I have this command in bash:
ACTIVE_MGMT_1=$(ssh -n ${MGMT_IP_1} ". .bash_profile; xms sho proc TRAF.*" 2>/dev/null | egrep " A " | awk '/TRAF/{print $1}' | cut -d "." -f2);
I was trying to do it in Python like this:
active_mgmgt_1 = os.popen("""ssh -n MGMT_IP_1 ". .bash_profile; xms sho proc TRAF.*" 2>/dev/null |egrep " A " |awk '/TRAF/{print $1}' |cut -d "." -f2""")
ACTIVE_MGMT_1 = active_mgmgt_1.read().replace('\n', '')
It doesn't work; any advice please?
Your popen call needs to be set to communicate via a pipe.
Also, stop trying to put everything on one line - Python doesn't require it and places a lot of emphasis on readable code.
I would strongly suggest doing the string processing in Python rather than with egrep (use str.find or the re module), awk (re or string methods) and cut (str.split).
It is also recommended to use subprocess.Popen rather than the os.popen functions, and shlex.split is often suggested to clear up this sort of argument-quoting issue.
Untested code:
import subprocess
import re
import os
MGMT_IP_1 = os.getenv('MGMT_IP_1')
sp = subprocess.Popen(
    ['ssh', '-n', MGMT_IP_1, '. .bash_profile; xms sho proc TRAF.*'],
    stdout=subprocess.PIPE, stderr=None)
(outtext, errtext) = sp.communicate()  # outtext is the remote command's stdout
# Proceed to process outtext from here using re, find and split
# to the equivalent of egrep " A " |awk '/TRAF/{print $1}' |cut -d "." -f2;
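For example, the remaining text processing might look something like this in pure Python (an untested sketch, assuming outtext has already been decoded to a string):

fields = []
for line in outtext.splitlines():
    if ' A ' in line and 'TRAF' in line:         # egrep " A "  and  awk /TRAF/
        first = line.split()[0]                  # awk '{print $1}'
        parts = first.split('.')
        if len(parts) > 1:
            fields.append(parts[1])              # cut -d "." -f2
ACTIVE_MGMT_1 = ''.join(fields)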
I'm trying to call the following command from a Python script and capture its output: awk -F':' '$1 == "VideoEdge" {print $2, $3, $8}' /etc/shadow
I've got the function working using subprocess.check_output and .Popen in a Python shell, but when called from a script it doesn't work and raises an exception which has no apparent output or message.
How can I get this command working from a script?
I've tried using check_output, Popen and shlex to work around what I thought was causing the problem. The code works fine in an interactive shell.
import shlex
import subprocess
from subprocess import PIPE

temp = "User"
cmd = "awk -F':' '$1 == \"" + temp + "\" {print $2, $3, $8}' /etc/shadow"
cmdOutput = subprocess.check_output(shlex.split(cmd))
print cmdOutput
temp = "User"
cmd = "awk -F':' '$1 == \"" + temp + "\" {print $2, $3, $8}' /etc/shadow"
cmdOutput = subprocess.Popen(cmd, stdin=PIPE, stdout=PIPE, stderr=PIPE)
print cmdOutput.communicate()[0]
I'd just do the following instead (offered for the user's consideration; the awk call isn't really needed here):
user = "someuser"
with open('/etc/shadow') as f:
    for line in f:
        if line.startswith(user):
            data = line.split(':')
            break
print(data)
Apply the same shlex.split(cmd) to the Popen call and it will work:
cmdOutput = subprocess.Popen(shlex.split(cmd), stdin=PIPE, stdout=PIPE, stderr=PIPE)
It turned out I had a permission issue with the file I was running the command against. *bangs head against desk*
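For anyone hitting a similarly silent failure from a script: capturing stderr makes the underlying error (a permission problem, in this case) visible. A minimal sketch, reusing the cmd string from above:

import shlex
import subprocess

try:
    cmdOutput = subprocess.check_output(shlex.split(cmd),
                                        stderr=subprocess.STDOUT)
except subprocess.CalledProcessError as e:
    # e.output holds whatever awk printed, e.g. a "Permission denied" message
    print(e.output)
    raise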
I am trying to run the following awk command inside python but I get a syntax error related to the quotes:
import subprocess
COMMAND = "df /dev/sda1 | awk /'NR==2 {sub("%","",$5); if ($5 >= 80) {printf "Warning! Space usage is %d%%", $5}}"
subprocess.call(COMMAND, shell=True)
I tried to escape the quotes but I am still getting the same error.
You may want to put ''' or """ around the string since you have both ' and ".
import subprocess
COMMAND = """df /dev/sda1 | awk 'NR==2 {sub("%","",$5); if ($5 >= 80) {printf "Warning! Space usage is %d%%", $5}}'"""
subprocess.call(COMMAND, shell=True)
There also seems to be a relevant existing answer for this: awk commands within python script
Try this:
import subprocess
COMMAND="df /dev/sda1 | awk 'NR==2 {sub(\"%\",\"\",$5); if ($5 >= 80) {printf \"Warning! Space usage is %d%%\", $5}}'"
subprocess.Popen(COMMAND,stdin=subprocess.PIPE,stdout=subprocess.PIPE, shell=True).stdout.read()
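If the goal is the disk-usage check itself rather than the awk exercise, a pure-Python sketch avoids the quoting problem entirely (assuming Python 3.3+ for shutil.disk_usage; note it takes a mount point rather than a device node):

import shutil

usage = shutil.disk_usage('/')                    # mount point, not /dev/sda1
percent_used = usage.used * 100 // usage.total
if percent_used >= 80:                            # same threshold as the awk script
    print("Warning! Space usage is %d%%" % percent_used)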
I was writing a Python script for deployment purposes, and one part of the script needed to explicitly kill a process if it had not stopped successfully.
Below is the Python code, which performs the equivalent of
Find the process ID of the process named myApplication:
ps -ef | grep myApplication | grep -v grep | awk '{print $2}'
and then runs
kill -9 PID    # where PID is the output of the earlier command
import os
import signal
import subprocess

def killApplicationProcessIfStillRunning(app_name):
    p1 = subprocess.Popen(['ps', '-ef'], stdout=subprocess.PIPE)
    p2 = subprocess.Popen(['grep', app_name], stdin=p1.stdout, stdout=subprocess.PIPE)
    p3 = subprocess.Popen(['grep', '-v', 'grep'], stdin=p2.stdout, stdout=subprocess.PIPE)
    p4 = subprocess.Popen(['awk', '{print $2}'], stdin=p3.stdout, stdout=subprocess.PIPE)
    out, err = p4.communicate()
    if out:
        print 'Attempting to kill ' + app_name + ' process with PID ' + out.splitlines()[0]
        os.kill(int(out.splitlines()[0]), signal.SIGKILL)
Now invoke the above method as
killApplicationProcessIfStillRunning('myApplication')
Hope it helps someone.
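As an aside, if pgrep is available the grep / grep -v / awk stages can be dropped altogether. A rough, untested sketch of the same idea (the helper name is mine; note that pgrep -f matches against the full command line, so it can also match this script if the name appears there):

import os
import signal
import subprocess

def kill_if_still_running(app_name):
    try:
        out = subprocess.check_output(['pgrep', '-f', app_name])
    except subprocess.CalledProcessError:
        return                      # pgrep exits non-zero when nothing matches
    for pid in out.split():
        os.kill(int(pid), signal.SIGKILL)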
I am trying to read the filename and timestamp of the most recent file for each of the two naming schemes seen in the code. I have the following code, roughly:
#!/usr/bin/env python
import string, subprocess, sys, os
mypath = "/path/to/file"
my_cmd = ["ls -lt --full-time " + mypath + "*DAI*.txt",
          "ls -lt --full-time " + mypath + "*CA*.txt"]
getmostrecent_cmd = "head -n 1"
getcols_cmd = "awk '{ print $6, $7, $9 }'"
for cmd in my_cmd:
    p1 = subprocess.Popen(cmd.split(), stdout=subprocess.PIPE)
    p2 = subprocess.Popen(getmostrecent_cmd.split(), stdin=p1.stdout, stdout=subprocess.PIPE)
    p3 = subprocess.Popen(getcols_cmd.split(), stdin=p2.stdout, stdout=subprocess.PIPE)
    output = p3.communicate()[0]
    print output
which gives me the following error(s):
ls: cannot access /path/to/file/*DAI*.txt: No such file or directory
awk: '{
awk: ^ invalid char ''' in expression
ls: cannot access /path/to/file/*CA*.txt: No such file or directory
awk: '{
awk: ^ invalid char ''' in expression
But:
I can use "ls -lt --full-time /path/to/file/*DAI*.txt" and get a result in the terminal. Why is it causing an issue with the same path?
The awk command, when put into subprocess directly, works fine; e.g. subprocess.Popen(["awk", ....], stdin=...., stdout=....) worked okay. But now I am getting an issue with the single quote. I tried triple quoting the string and escaping the single-quote.
I can use "ls -lt --full-time /path/to/file/DAI.txt" and get a
result in the terminal. Why is it causing an issue with the same path?
Glob expansion is performed by the shell. By default, the shell is not involved in starting a new subprocess via Popen(). To get the glob expanded you must pass the shell=True argument, and with shell=True the command should be passed as a single string rather than a list:
p1 = subprocess.Popen(cmd, stdout=subprocess.PIPE, shell=True)
#                                                  ^^^^^^^^^^
The awk command, when put in to subprocess directly, works fine; E.g.
subprocess.Popen(["awk", ....], stdin=...., stdout=....) worked okay. But now I am getting an issue with the single quote. I tried
triple quoting the string and escaping the single-quote.
On the shell command line the single quotes in awk '{ print $6, $7, $9 }' are needed to have the string { print $6, $7, $9 } treated as a single argument (as well as to prevent variable expansion). The single quotes are removed by the shell, and awk only sees the string { print $6, $7, $9 }. Since Popen() by default doesn't involve the shell when executing the subprocess command and passes the arguments to the command verbatim, you don't need the single quotes:
subprocess.Popen(["awk", "{ print $6, $7, $9 }"], stdin=...., stdout=....)
I want to run a bash command from the Python shell.
My bash command is:
grep -Po "(?<=<cite>).*?(?=</cite>)" /tmp/file1.txt | awk -F/ '{print $1}' | awk '!x[$0]++' > /tmp/file2.txt
what I tried is:
#!/usr/bin/python
import commands
commands.getoutput('grep ' + '-Po ' + '\"\(?<=<dev>\).*?\(?=</dev>\)\" ' + '/tmp/file.txt ' + '| ' + 'awk \'!x[$0]++\' ' + '> ' + '/tmp/file2.txt')
But I don't get any result.
Thank you
If you want to avoid splitting your arguments and worrying about pipes, you can use the shell=True option:
cmd = "grep -Po \"(?<=<dev>).*?(?=</dev>)\" /tmp/file.txt | awk -F/ '{print $1}' | awk '!x[$0]++' > file2.txt"
out = subprocess.check_output(cmd, shell=True)
This will run a subshell which understands all your directives, including | for piping and > for redirection. If you do not do this, these symbols, which are normally parsed by the shell, will just be passed on to the grep program as arguments.
Otherwise, you have to create the pipes yourself. For example (untested code below):
grep_p = subprocess.Popen(["grep", "-Po", "(?<=<dev>).*?(?=</dev>)", "/tmp/file.txt"], stdout=subprocess.PIPE)
awk_p = subprocess.Popen(["awk", "-F/", "{print $1}"], stdin=grep_p.stdout, stdout=subprocess.PIPE)
file2_fh = open("file2.txt", "w")
awk_p_2 = subprocess.Popen(["awk", "!x[$0]++"], stdout=file2_fh, stdin=awk_p.stdout)
awk_p_2.communicate()
However, you're missing the point of Python if you are doing this. You should instead look into the re module (re.match, re.sub, re.search), though I'm not familiar enough with awk to translate your commands.
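For what it's worth, a rough re-based translation of the original pipeline might look like this (untested sketch; the lookarounds mirror grep -Po, split('/') mirrors awk -F/, and the seen set plays the role of awk '!x[$0]++'):

import re

seen = set()
results = []
with open('/tmp/file1.txt') as src:
    for match in re.finditer(r'(?<=<cite>).*?(?=</cite>)', src.read()):
        value = match.group(0).split('/')[0]      # awk -F/ '{print $1}'
        if value not in seen:                     # awk '!x[$0]++'
            seen.add(value)
            results.append(value)

with open('/tmp/file2.txt', 'w') as dst:
    dst.write('\n'.join(results) + '\n')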
The recommended way to run system commands in Python is to use the subprocess module.
import subprocess
a = ['grep', '-Po', '(?<=<dev>).*?(?=</dev>)', '/tmp/file.txt']
b = ['awk', '-F/', '{print $1}']
c = ['awk', '!x[$0]++']
p1 = subprocess.Popen(a, stdout=subprocess.PIPE)
p2 = subprocess.Popen(b, stdin=p1.stdout, stdout=subprocess.PIPE)
p3 = subprocess.Popen(c, stdin=p2.stdout, stdout=subprocess.PIPE)
p1.stdout.close()
p2.stdout.close()
out, err = p3.communicate()
print out
The point of creating pipes between the subprocesses is security and easier debugging. It also makes the code much clearer in terms of which process gets its input from where and sends its output to where.
Let us write a simple function to easily deal with these messy pipes for us:
def subprocess_pipes(pipes, last_pipe_out=None):
    import subprocess
    from subprocess import PIPE
    last_p = None
    for cmd in pipes:
        out_pipe = PIPE if not (cmd == pipes[-1] and last_pipe_out) else open(last_pipe_out, "w")
        cmd = cmd if isinstance(cmd, list) else cmd.split(" ")
        in_pipe = last_p.stdout if last_p else None
        p = subprocess.Popen(cmd, stdout=out_pipe, stdin=in_pipe)
        last_p = p
    comm = last_p.communicate()
    return comm
Then we run,
subprocess_pipes(("ps ax", "grep python"), last_pipe_out = "test.out.2")
The result is a "test.out.2" file with the contents of piping "ps ax" into "grep python".
In your case,
a = ["grep", "-Po", "(?<=<cite>).*?(?=</cite>)", "/tmp/file1.txt"]
b = ["awk", "-F/", "{print $1}"]
c = ["awk", "!x[$0]++"]
subprocess_pipes((a, b, c), last_pipe_out = "/tmp/file2.txt")
The commands module is obsolete now.
If you don't actually need the output of your command you can use
import os
exit_status = os.system("your-command")
Otherwise you can use
import subprocess
out, err = subprocess.Popen("your | commands", stdout=subprocess.PIPE, stderr=subprocess.PIPE, shell = True).communicate()
Note: your command sends stdout to file2.txt, so I wouldn't expect to see anything in out; you will, however, still see error messages on stderr, which will go into err.
You can also simply use
import os
os.system(command)
I think what you are looking for is something like:
subprocess.check_output(same arguments as Popen, **kwargs). Use it the same way you would use a Popen command; it returns the output of the program that's being called.
For more details here is a link: http://freefilesdl.com/how-to-call-a-shell-command-from-python/
How could I run this code using subprocess module?
commands.getoutput('sudo blkid | grep 'uuid' | cut -d " " -f 1 | tr -d ":"')
I've tried this but it doesn't work at all
out_1 = subprocess.Popen(('sudo', 'blkid'), stdout=subprocess.PIPE)
out_2 = subprocess.Popen(('grep', 'uuid'), stdin=out_1.stdout, stdout=subprocess.PIPE)
out_3 = subprocess.Popen(('cut', '-d', '" "', '-f', '1'), stdin=out_2.stdout, stdout=subprocess.PIPE)
main_command = subprocess.check_output(('tr', '-d', '":"'), stdin=out_3.stdout)
main_command
Error: cut: the delimiter must be a single character
from subprocess import check_output, STDOUT
shell_command = '''sudo blkid | grep 'uuid' | cut -d " " -f 1 | tr -d ":"'''
output = check_output(shell_command, shell=True, stderr=STDOUT,
universal_newlines=True).rstrip('\n')
By the way, it returns nothing on my system unless grep -i is used; in that case it returns device names. If that is your intent then you could use a different command:
from subprocess import check_output
devices = check_output(['sudo', 'blkid', '-odevice']).split()
I'm trying not to use shell=True
It is ok to use shell=True if you control the command, i.e., if you don't use user input to construct it. Consider the shell command a special language that allows you to express your intent concisely (like a regex for string processing). It is more readable than several lines of code that do not use the shell:
from subprocess import Popen, PIPE
blkid = Popen(['sudo', 'blkid'], stdout=PIPE)
grep = Popen(['grep', 'uuid'], stdin=blkid.stdout, stdout=PIPE)
blkid.stdout.close() # allow blkid to receive SIGPIPE if grep exits
cut = Popen(['cut', '-d', ' ', '-f', '1'], stdin=grep.stdout, stdout=PIPE)
grep.stdout.close()
tr = Popen(['tr', '-d', ':'], stdin=cut.stdout, stdout=PIPE,
universal_newlines=True)
cut.stdout.close()
output = tr.communicate()[0].rstrip('\n')
pipestatus = [cmd.wait() for cmd in [blkid, grep, cut, tr]]
Note: there are no quotes inside quotes here (no '" "', '":"'). Also unlike the previous command and commands.getoutput(), it doesn't capture stderr.
plumbum provides some syntax sugar:
from plumbum.cmd import sudo, grep, cut, tr
pipeline = sudo['blkid'] | grep['uuid'] | cut['-d', ' ', '-f', '1'] | tr['-d', ':']
output = pipeline().rstrip('\n') # execute
See How do I use subprocess.Popen to connect multiple processes by pipes?
Pass your command as one string with shell=True, like this:
main_command = subprocess.check_output('tr -d ":"', stdin=out_3.stdout, shell=True)
Note that passing a list together with shell=True does not execute the commands one by one: on POSIX systems only the first list element is treated as the command and the remaining items become arguments to the shell itself. To run several commands in sequence, call check_output once per command (see the sketch below) or join them with ';' into a single string.
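If you really do want to run several independent commands one after another, a simple loop is enough (a rough sketch; command1 and command2 stand in for complete shell command strings):

import subprocess

commands = [command1, command2]          # each entry is a complete shell command string
outputs = [subprocess.check_output(cmd, shell=True) for cmd in commands]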