Run command using rsh on a remote computer - python

I am trying to run a Python script, and I need to rsh a command from the script. The command I want to run is: df -Pk|grep "sd\|md"|gawk '{print $2}'
and I do it as follows:
cmd2='df -Pk|grep \\\"sd\|md\\\"|gawk \'{print $2}\''
process = subprocess.Popen(['rsh',ip,cmd2],stdout=subprocess.PIPE)
output = process.communicate()[0]
However, when I run the script, I get nothing in the output.
I am new to Python, and as far as I can tell it's a problem with the escape characters.
Any help would be great.
Note:
I have to use rsh only and cannot use ssh.
Thanks

Enclose the command, with proper shell quoting, in triple quote marks:
"""df -Pk|grep "sd\|md"|gawk '{print $2}'"""
See also the Python tutorial's bit on strings.
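Applied to the original snippet, a minimal sketch (the host name is hypothetical): a raw triple-quoted string keeps the inner quotes and the backslash intact, so the remote shell receives the command exactly as typed:
import subprocess

ip = 'remote-host'  # hypothetical host; substitute your own
cmd2 = r"""df -Pk|grep "sd\|md"|gawk '{print $2}'"""  # raw string: backslash and quotes pass through untouched
process = subprocess.Popen(['rsh', ip, cmd2], stdout=subprocess.PIPE)
output = process.communicate()[0]
print(output.decode())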

Python Subprocess with double quotes

I've got a Docker Service Log that takes in NiFi Actions and I want to capture only Log Entries that include "Successfully sent" and "Failed to process session" (and nothing more). They should be captured in a directory called "nifi_logs" in the present working directory. I need to do all of this using Python.
This is what I got so far:
docker_log = 'docker service logs nifi | grep -e "Successfully sent" -e "Failed to process session" >> $PWD/nifi_logs/nifi1.log'
subprocess.Popen(docker_log, shell=True, stdout=subprocess.PIPE, stderr=subprocess.STDOUT)
I believe subprocess.Popen() is having difficulty with the double quotes used in the grep, as nifi1.log is completely empty. If the first command looks like the following:
docker_log = 'docker service logs nifi | grep session >> $PWD/nifi_logs/nifi1.log'
The Python code works just fine and captures all log entries with "session" in nifi1.log. As I explained above, though, I need to grep for 2 kinds of log entries, and both include multiple words, meaning I need to use quotes.
If I were to just run this command on the Terminal without Python:
docker service logs nifi | grep -e "Successfully sent" -e "Failed to process session" >> $PWD/nifi_logs/nifi1.log
The log generates the entries just fine, so I know the Docker Service command is written correctly.
I've tried switching the single and double quotes around, I've tried using \" instead of " within the single quotes ... nifi1.log continues to be empty.
I also tried using os.system() instead of subprocess.Popen(), but I run into the same problem (and I believe os.system() is somewhat deprecated).
Any ideas what I'd need to do to change what docker_log equals so that it will properly grep for the 2 search criteria? So you're aware: this question is not asking HOW I generate the log entries (I know what Docker Services I'm looking for, they generate properly), just what I need to do to get Python Subprocess Popen to accept a command with quotes in it.
Thank you for your assistance, @David. Looking at your example, I found a solution: I removed stdout=subprocess.PIPE from subprocess.Popen and now it accepts the double quotes just fine!
docker_log = 'docker service logs nifi | grep -e "Successfully sent" -e "Failed to process session" >> $PWD/nifi_logs/nifi1.log'
subprocess.Popen(docker_log, shell=True, stderr=subprocess.STDOUT)
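If the script also needs to wait until the pipeline has finished writing nifi1.log before moving on, subprocess.run blocks until the shell command completes — a minimal sketch reusing the same command string:
import subprocess

docker_log = 'docker service logs nifi | grep -e "Successfully sent" -e "Failed to process session" >> $PWD/nifi_logs/nifi1.log'
subprocess.run(docker_log, shell=True)  # returns only after the whole pipeline has exited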

How to replicate bash's escaping on commands passed as arguments to other commands

I'm passing the result of the execution of a command to python as input, like so:
$ python parse_ips.py "$(kubectl get configmap ...)"
This works fine when executing from the command line, however I'm now trying to edit the file using PyCharm. Therefore I need the escaped version of the result of this command which I can paste into PyCharm's debug configuration, as I can't execute the command in real-time like I can do on the command line.
However, I am struggling to find a way to replicate the escaping bash does behind the scenes, so I can use the result as an argument within the PyCharm configuration. Running the above kubectl command results in a multi-line string which includes spaces and quotes. When I paste this into PyCharm it just interprets it as multiple arguments. I'm looking for the escaped result, which I could paste directly into the command line, or into PyCharm's debug configuration, to achieve the same result with a fixed parameter for testing.
Any help would be greatly appreciated!
Edit: To clarify, on the command line the result of the $(kubectl ...) command is passed into the Python program as a single command-line argument when it is surrounded by quotes ("$(kubectl ...)"). So in the Python program, sys.argv[1] contains the entire output of $(kubectl get configmap ...). However, if I execute that command myself on the command line, the result is a multi-line string.
If I then copy the result of that into PyCharm (or even on the command line again), it is interpreted as many command line arguments. E.g. it would look something like this:
$ python parse_ips.py apiVersion: v1
data:
item1: ifconfig-push 127.0.0.0 255.255.0.0
item2: ifconfig-push 127.0.0.1 255.255.0.0
item3: ifconfig-push 127.0.0.2 255.255.0.0
...
And so on. This obviously doesn't work the way it did before, so I am unable to test my program without making the kubectl call from the command line each time. I was looking to replicate what "$(kubectl ...)" gets converted into, so that the entire output is passed as a single command-line argument.
I am struggling to find a way to replicate the escaping bash does behind the scenes
Typically use printf "%q" to escape stuff.
printf "%q" "$(kubectl get configmap ....)"
This is printf as the bash builtin. It differs from coreutils printf; newer versions of the latter also support %q, with a different quoting style:
/usr/bin/printf "%q" "$(kubectl get configmap ....)"
Modern bash also has quoting expansion:
var="$(kubectl get configmap ....)"
echo "${var#Q}"
And there is also the quoting style produced by set -x.
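A comparable escape can also be produced from the Python side with shlex.quote — a minimal sketch, assuming the remaining kubectl arguments are filled in as in the question:
import shlex
import subprocess

output = subprocess.check_output(['kubectl', 'get', 'configmap'], text=True)
print(shlex.quote(output))  # a single-quoted string that can be pasted as one argument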
I would suggest using a file:
kubectl get configmap ... > /tmp/tempfile
python parse_ips.py "$(cat /tmp/tempfile)"
With xclip you can copy command output straight to the X server clipboard, which is handy:
printf "%q" "$(kubectl get configmap ...)" | xclip -selection clipboard
# then in another window:
python parse_ips.py <right mouse click><select paste>

Fail to run a bash command on python with subprocess and sed

My goal is to execute the following bash command in Python and store its output:
echo 'sudo ./run_script.sh -dates \\{\\'2017-11-16\\',\\'2017-11-29\\'\\}'|sed 's;\\\\;\\;'
When I run this command in bash, the output is: sudo ./run_script.sh -dates \{\'2019-10-05\',\'2019-10-04\'\}
My initial idea was to replace the double backslashes with single backslashes in Python. As ridiculous as it seems, I couldn't manage it in Python (only with print() does the output look the way I want, but I can't store the output of print(), and str() doesn't convert \\ to \). So I decided to do it in bash.
import subprocess
t= 'some \\ here'
cmd = "echo \'"+ t+"\'|sed 's;\\\\;\\;'"
ps = subprocess.run(cmd,shell=True,stdout=subprocess.PIPE,stderr=subprocess.STDOUT)
ps.stdout
Out[6]: b"sed: -e expression #1, char 7: unterminated `s' command\n"
Running Python 3.6.8 on Ubuntu 18
Try using subprocess.check_output instead. You also need an extra backslash for every backslash in your command: each backslash the shell should see has to be doubled again in the Python string literal.
import subprocess
command = "echo 'some \\\\here'|sed 's;\\\\\\\\;\\\\;'"
output = subprocess.check_output(command, shell=True).decode()
print(output)  # prints the expected result: "some \here"
After re-reading your question I kinda understood what you wanted.
a = r'some \here'
print(a) #some \here
Again, raw string literals...
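If the only goal is to turn the doubled backslashes into single ones, the echo|sed round trip can be skipped entirely — a minimal sketch using a raw string literal for the input; the resulting string already contains single backslashes, and the doubled ones only show up in its repr:
cmd = r"sudo ./run_script.sh -dates \\{\\'2017-11-16\\',\\'2017-11-29\\'\\}"
fixed = cmd.replace('\\\\', '\\')  # each literal \\ becomes \
print(fixed)  # sudo ./run_script.sh -dates \{\'2017-11-16\',\'2017-11-29\'\}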

using apostrophe with python's subprocess

It seems like I can't use apostrophes; the command fails with no informative error.
I'm trying to execute the following:
secretLsCmd = subprocess.Popen(('docker', 'secret', 'ls') , stdout=subprocess.PIPE)
oneWhitespaceCmd = subprocess.Popen(('tr', '-s','" "') , stdout=subprocess.PIPE, stdin=secretLsCmd.stdout)
onlySecretsCmd = subprocess.check_output(('cut', "-d' '", '-f2') , stdin=oneWhitespaceCmd.stdout)
In a normal Linux terminal, it would execute the following command:
docker secret ls | tr -s " " | cut -d' ' -f2
Running this command in the CLI works fine, but once I put it in Python it doesn't work. The first 2 commands in the pipe work fine (I have checked); the last command does not, exiting with error code 1... I'm almost 100% sure it is the -d' ', but how can I fix that? Any ideas?
This line:
oneWhitespaceCmd = subprocess.Popen(('tr', '-s','" "'), ...)
is actually running:
tr -s '" "'
so you want to lose the extra double quotes there: Python passes each argument through to the command exactly as written, so it never needs shell-style quoting.
This command:
onlySecretsCmd = subprocess.check_output(('cut', "-d' '", '-f2'), ...)
is equivalent to the shell command:
cut '-d'"'"' '"'"'' -f2
so again, you probably just want to lose the quotes round the whitespace:
onlySecretsCmd = subprocess.check_output(('cut', "-d ", '-f2'), ...)
and leave it to Python to insert quotes where required. This will actually run (what should be equivalent to what you want though not identical):
cut '-d ' -f2
I used shlex.quote() to build the shell equivalents of the commands you are running. In practice, unless you tell subprocess to invoke a shell, no shell parsing happens at all: each element of the tuple is handed to the program as a single argument, which is exactly what bash would pass after stripping the quotes. So internally no escaping is happening, and the quotes that would distinguish the arguments in a shell aren't needed either.
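Putting it together, a minimal end-to-end sketch of the corrected pipeline (assuming the docker CLI is available), equivalent to docker secret ls | tr -s " " | cut -d" " -f2:
import subprocess

secretLsCmd = subprocess.Popen(('docker', 'secret', 'ls'), stdout=subprocess.PIPE)
oneWhitespaceCmd = subprocess.Popen(('tr', '-s', ' '), stdin=secretLsCmd.stdout, stdout=subprocess.PIPE)
secretLsCmd.stdout.close()  # let docker receive SIGPIPE if tr exits early
onlySecretsCmd = subprocess.check_output(('cut', '-d ', '-f2'), stdin=oneWhitespaceCmd.stdout)
oneWhitespaceCmd.stdout.close()
print(onlySecretsCmd.decode())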

Avoid subprocess.Popen auto escaping my backslashes in grep

I'm trying to write an svn pre-commit hook in python. Part of this involves checking the diff file to see if there are any actual file changes (as opposed to just property changes).
I have a working grep command which I can execute fine on the shell
grep "^\(Added: \|Modified: \|Deleted: \)" diff filename | grep -v 'svn:'
However, when I put it through subprocess.Popen it escapes all my backslashes, which knackers the regexp.
Executing command: ['grep', '"^\\(Added: \\|Modified: \\|Deleted: \\)"', ...]
How do I avoid this?
NB: I'm aware that I can pipe results between subprocesses and I can do the two greps that way. I need help getting the first one working first though :/
NB2: I also tried using filterdiff --clean instead and couldn't get it to work. Searching for Added, Modified or Deleted lines, removing those with 'svn:' in them, and checking that I had some results did seem to work, though.
Python code:
command = ['grep', '"^\(Added: \|Modified: \|Deleted: \)"', filename]
sys.stdout.write('Executing command: %s\n' % (command))
p = subprocess.Popen(command,
                     stdin=subprocess.PIPE,
                     stdout=subprocess.PIPE,
                     stderr=subprocess.STDOUT,
                     shell=True)
data = p.stdout.read()
if len(data) == 0:
    sys.stdout.write("Diff does not contain any file modifications.\n")
    exit(0)
You need to consider what you want grep to see in its command line arguments.
The first argument needs to be the literal string "^\(Added: \|Modified: \|Deleted: \)", so that means that it shouldn't include the double quotes but should include the backslashes.
The way to express this kind of string is to use Python raw strings:
command = ['grep', r'^\(Added: \|Modified: \|Deleted: \)', filename]
A good way to check what you're actually running is to replace grep by echo so you can at least see what you're passing to the command.
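One more hedged aside: on POSIX systems, passing a list together with shell=True means only the first element is treated as the command and the rest go to the shell itself, so shell=True should also be dropped when using the list form. A minimal sketch chaining both greps without a shell (the filename value is hypothetical):
import subprocess
import sys

filename = 'diff_filename'  # hypothetical: path to the diff file used by the hook
command = ['grep', r'^\(Added: \|Modified: \|Deleted: \)', filename]
p1 = subprocess.Popen(command, stdout=subprocess.PIPE)
p2 = subprocess.Popen(['grep', '-v', 'svn:'], stdin=p1.stdout, stdout=subprocess.PIPE)
p1.stdout.close()  # allow p1 to get SIGPIPE if p2 exits first
data = p2.communicate()[0]
if len(data) == 0:
    sys.stdout.write("Diff does not contain any file modifications.\n")
    sys.exit(0)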
