SSH to remote system and run python script and get output - python

I am experimenting with running Python scripts over SSH, with the help of these answers - Run local python script on remote server
To connect to the server over SSH, I am currently using the subprocess module from Python:
#ssh command : ssh user@machine python < script.py - arg1 arg2
output = subprocess.Popen("ssh user@machine python3 < script.py", shell=True, stdout=subprocess.PIPE, stderr=subprocess.PIPE).communicate()
Now I get the output of the script as a tuple of byte strings, which is hard to work with. The output also contains other information, such as warnings.
I tried decoding the output, but the result does not look great.
Is there any other way to get the output back over SSH? For example, if script.py prints a Python dictionary or returns JSON data, can I get it back in the same format and save it in the output variable?
Thanks

There are a couple of ways to do this with subprocess depending on how up to date your Python is.
In general, any byte string in Python can be turned into 'proper' text by decoding it. In your example, communicate() returns a tuple of (stdout, stderr), so we need to pick one of those elements with output[0] or output[1]:
>>> output = subprocess.Popen("ssh git@github.com", shell=True, stdout=subprocess.PIPE, stderr=subprocess.PIPE).communicate()
>>> output[1].decode('utf-8')
"PTY allocation request failed on channel 0\r\nHi Chris! You've successfully authenticated, but GitHub does not provide shell access.\nConnection to github.com closed.\r\n"
You can tidy that up further by including the universal_newlines=True option.
Assuming you are on Python 3.7 or later, you can simplify your command so that you capture decoded output straight away:
>>> subprocess.run(['ssh', 'git@github.com'], capture_output=True, text=True).stderr
"PTY allocation request failed on channel 0\nHi Chris! You've successfully authenticated, but GitHub does not provide shell access.\nConnection to github.com closed.\n"
If you are struggling with the escaped newlines (\n) one approach would be to split them so you get an array of lines instead:
>>> output = subprocess.run(['ssh', 'git@github.com'], capture_output=True, text=True).stderr.split("\n")
>>> output
['PTY allocation request failed on channel 0', "Hi Chris! You've successfully authenticated, but GitHub does not provide shell access.", 'Connection to github.com closed.', '']
>>> for line in output:
...     print(line)
PTY allocation request failed on channel 0
Hi Chris! You've successfully authenticated, but GitHub does not provide shell access.
Connection to github.com closed.
print() will happily accept multiple arguments to print, normally separated by a space, but you can override the separator:
print(*output, sep="\n")
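To get structured data back, as the question asks, one option is to have script.py print its result as JSON on stdout and parse it locally. A minimal sketch, assuming the remote script ends with something like print(json.dumps(result)):
import json
import subprocess

# capture only stdout as text; warnings on stderr stay separate
# and do not corrupt the JSON payload
result = subprocess.run("ssh user@machine python3 < script.py",
                        shell=True, capture_output=True, text=True)
data = json.loads(result.stdout)   # back to a Python dict/list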

Related

Python Subprocess with double quotes

I've got a Docker Service Log that takes in NiFi Actions and I want to capture only Log Entries that include "Successfully sent" and "Failed to process session" (and nothing more). They should be captured in a directory called "nifi_logs" in the present working directory. I need to do all of this using Python.
This is what I got so far:
docker_log = 'docker service logs nifi | grep -e "Successfully sent" -e "Failed to process session" >> $PWD/nifi_logs/nifi1.log'
subprocess.Popen(docker_log, shell=True, stdout=subprocess.PIPE, stderr=subprocess.STDOUT)
I believe subprocess.Popen() is having difficulty with the double quotes used in the grep, as nifi1.log is completely empty. If the first command looks like the following:
docker_log = 'docker service logs nifi | grep session >> $PWD/nifi_logs/nifi1.log'
The Python code works just fine and captures all log entries with "session" in nifi1.log. As I explained above though, I need to grep for 2 kinds of Log Entries and both include multiple words, meaning I need to use quotes.
If I were to just run this command on the Terminal without Python:
docker service logs nifi | grep -e "Successfully sent" -e "Failed to process session" >> $PWD/nifi_logs/nifi1.log
The log generates the entries just fine, so I know the Docker Service command is written correctly.
I've tried switching the single and double quotes around, I've tried using \" instead of " within the single quotes ... nifi1.log continues to be empty.
I also tried using os.system() instead of subprocess.Popen(), but I ran into the same problem (and I believe os.system() is somewhat deprecated).
Any ideas what I'd need to do to change what docker_log equals so that it will properly grep for the 2 search criteria? So you're aware: this question is not asking HOW I generate the log entries (I know what Docker Services I'm looking for, they generate properly), just what I need to do to get Python Subprocess Popen to accept a command with quotes in it.
Thank you for your assistance @David. Looking at your example, I found a solution: I removed stdout=subprocess.PIPE from subprocess.Popen and now it accepts double quotes just fine!
docker_log = 'docker service logs nifi | grep -e "Successfully sent" -e "Failed to process session" >> $PWD/nifi_logs/nifi1.log'
subprocess.Popen(docker_log, shell=True, stderr=subprocess.STDOUT)
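If you also want Python to wait for the whole pipeline to finish before carrying on, subprocess.run accepts the same arguments and blocks until the command completes; a minimal sketch along the same lines:
import subprocess

docker_log = 'docker service logs nifi | grep -e "Successfully sent" -e "Failed to process session" >> $PWD/nifi_logs/nifi1.log'
# run() blocks until the shell pipeline has finished writing to nifi1.log
subprocess.run(docker_log, shell=True, stderr=subprocess.STDOUT)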

Piping command with seemingly several separate outputs to file

I am using speedtest-cli to get my internet speed in a python script. I would like to run this command in shell via subprocess.Popen.
Here is the command in terminal:
`speedtest-cli --share > output.log`
speedtest-cli runs the test, whilst --share provides me with an additional link in the output, pointing to an image of the speedtest result. Here is the content of output.log:
Retrieving speedtest.net configuration...
Testing from M-net (xxx.xxx.xxx.xxx)...
Retrieving speedtest.net server list...
Selecting best server based on ping...
Hosted by XXXXXXXXXXXXXXXXXXXXXX [16.12 km]: 20.902 ms
Testing download speed................................................................................
Download: 48.32 Mbit/s
Testing upload speed......................................................................................................
Upload: 12.49 Mbit/s
Share results: http://www.speedtest.net/result/670483456.png
If I run the command in terminal, I get all the test results as well as the link in the target file as expected. I confirmed it is all stdout and not another channel by using this grep trick: command | grep .
I am trying to run it in Python as follows:
subprocess.Popen(['speedtest-cli', '--share', '>', 'output.log'],
                 stdout=subprocess.PIPE, shell=True)
...and I also tried putting the output into the file directly via python:
with open('output.log', 'w') as f:
    Popen(['speedtest-cli', '--json', '--share'], stdout=f, shell=True)
Neither of these work. I get a nice file created with the latter approach, BUT the link is not included! (the last line in the output above).
Against all the warnings of deprecation and better safety with using the subprocess module, I became desperate and tried os.system():
os.system('speedtest-cli --share > output.log')
Annoyingly, this works... the full output along with the link is captured in the file.
What is going on here? How do I get the link to be captured using Popen?
I'm using Python 3.5
When using shell=True, your argument to Popen needs to be a string, not a list:
subprocess.Popen('speedtest-cli --json --share > output.log',
stdout=subprocess.PIPE, shell=True)
Compare:
>>> subprocess.Popen('echo hello', shell=True)
>>> hello
And:
>>> subprocess.Popen(['echo', 'hello'], shell=True)
>>>
When you pass a list and you are using shell=True, only the first item is relevant and the remainder is ignored.
If you want to collect the output yourself, consider subprocess.check_output:
>>> output = subprocess.check_output(['echo', 'hello'])
>>> output
b'hello\n'
Or:
>>> output = subprocess.check_output('echo hello', shell=True)
The check_output function works with both Python 2 and Python 3. In Python 3.5 and later, you also have the run function available.
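If the goal is simply to get everything into output.log without involving the shell, you can also hand the open file to run directly; a minimal sketch (note there is no shell=True, so the --share flag is passed through correctly):
import subprocess

with open('output.log', 'w') as f:
    # the list form works here because no shell is involved,
    # so --share reaches speedtest-cli and the share link is captured
    subprocess.run(['speedtest-cli', '--share'], stdout=f)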

How to get output from netsh using python subprocess?

Can anyone please advise what I am doing wrong? There is no output shown when executing a netsh command on Windows using the Python subprocess library.
Example:
p = subprocess.run('netsh dhcp show server', shell=True, stdout=subprocess.PIPE)
print(p.stdout.decode('utf-8'))
Output: empty string
When I execute some other command, e.g. echo Hi, I get an output:
p = subprocess.run('echo Hi', shell=True, stdout=subprocess.PIPE)
print(p.stdout.decode('utf-8'))
My intention is to get a list of our DHCP servers and parse the output.
Thanks!
Try changing the console code page to UTF-8 before executing sub-processes:
chcp 65001
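Since chcp only affects the shell session it runs in, one way to apply this advice from Python is to chain it with the netsh command in the same cmd.exe invocation; a minimal sketch, assuming the command runs through cmd.exe:
import subprocess

# run chcp and netsh in the same cmd.exe session so the UTF-8
# code page is active when netsh writes its output
p = subprocess.run('chcp 65001 && netsh dhcp show server',
                   shell=True, stdout=subprocess.PIPE)
print(p.stdout.decode('utf-8'))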

Hiding shell output from subprocess.check_output()

I am retrieving information about my wireless connection by using
In [4]: subprocess.check_output('iwconfig')
eth0 no wireless extensions.
lo no wireless extensions.
Out[4]: 'wlan0 ...'
I get the string that I want, but I would like to keep the shell clean of the information about eth0 and lo (notice that these lines are not in the check_output return string; they are printed to the shell between ipython's In [4] and Out [4]).
My current guess is that these two lines are considered a warning and are not caught by check_output. Hoping they output to stderr I tried a few variations on
iwconfig > /dev/null 2>&1 # empties my wanted string
iwconfig 2 > /dev/null # throws an error
but without success so far.
How can I prevent the check_output from outputting anything to the shell?
If you want the stderr as your stdout:
subprocess.check_output('iwconfig', stderr=subprocess.STDOUT)
If what you want is just to wipe out your stderr:
import os
subprocess.check_output('iwconfig', stderr=open(os.devnull, 'w'))
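On Python 3.3 and later, subprocess.DEVNULL does the same thing without opening os.devnull yourself:
subprocess.check_output('iwconfig', stderr=subprocess.DEVNULL)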

Python 2.7 - Quotes added to command when using Popen to execute from variable

I'm trying to send a command through a socket from client to server to execute a command on the server and send the output back to me. Everything works fine if the command is one word with no options. However if I use an option such as netstat -an or dir c:\ then the command isn't recognized and from the output it looks like quotes are put around the command before being executed ('"netstat -an"' is not recognized as an internal or external command). I know they aren't saved in the variable this way because I printed it before being executed to error check. Please help. Here is what my code looks like:
commout = subprocess.Popen([data], stdout=subprocess.PIPE, shell=True)
(out, err) = commout.communicate()
Try making data an array of arguments (the first being the actual command).
For example:
commout = subprocess.Popen(['netstat', '-an'], stdout=subprocess.PIPE, shell=True)
The first element is a string representing the actual command (netstat), the next element is a string representing the first argument (-an).
To clarify, Popen(['echo', 'a', 'b']) is equivalent to echo a b on the command line, whereas Popen(['echo', 'a b']) would be equivalent to echo "a b" instead (i.e. a single argument with a space between a and b).
If you pass a list, each item in it will be quoted separately. In this case, just pass a string:
subprocess.Popen(data, stdout=subprocess.PIPE, shell=True)
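If you prefer the list form from the first answer (dropping shell=True), shlex.split can break the received command string into separate arguments; a minimal sketch, assuming data holds something like 'netstat -an':
import shlex
import subprocess

args = shlex.split(data)              # e.g. 'netstat -an' -> ['netstat', '-an']
commout = subprocess.Popen(args, stdout=subprocess.PIPE)
(out, err) = commout.communicate()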
