How to connect Paramiko stdin to console? [duplicate] - python

I'm trying to run an interactive command through Paramiko. The command tries to prompt for a password, but I do not know how to supply the password through Paramiko's exec_command, and the execution hangs. Is there a way to send values to the terminal when a command expects input interactively?
ssh = paramiko.SSHClient()
ssh.connect(server, username=username, password=password)
ssh_stdin, ssh_stdout, ssh_stderr = ssh.exec_command("psql -U factory -d factory -f /tmp/data.sql")
Does anyone know how this can be addressed? Thank you.

The full paramiko distribution ships with a lot of good demos.
In the demos subdirectory, demo.py and interactive.py have full interactive TTY examples which would probably be overkill for your situation.
In your example above ssh_stdin acts like a standard Python file object, so ssh_stdin.write should work so long as the channel is still open.
I've never needed to write to stdin, but the docs suggest that a channel is closed as soon as a command exits, so using the standard stdin.write method to send a password up probably won't work. There are lower level paramiko commands on the channel itself that give you more control - see how the SSHClient.exec_command method is implemented for all the gory details.
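If you need that extra control, here is a minimal sketch of the lower-level channel route (not a documented Paramiko recipe; the command and credentials are the question's placeholders, and whether the remote program actually reads the password from the channel depends on the program):
import paramiko

# Sketch: drive the channel directly instead of the file-like wrappers.
ssh = paramiko.SSHClient()
ssh.set_missing_host_key_policy(paramiko.AutoAddPolicy())
ssh.connect(server, username=username, password=password)

channel = ssh.get_transport().open_session()
channel.exec_command("psql -U factory -d factory -f /tmp/data.sql")
channel.send("password\n")  # only works if psql reads the password from stdin
while not channel.exit_status_ready():
    if channel.recv_ready():
        print(channel.recv(1024).decode(), end="")
ssh.close()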

I had the same problem trying to create an interactive SSH session using the ssh library, a fork of Paramiko.
I dug around and found this article:
Updated link (last version before the link generated a 404): http://web.archive.org/web/20170912043432/http://jessenoller.com/2009/02/05/ssh-programming-with-paramiko-completely-different/
To continue your example, you could do:
ssh_stdin, ssh_stdout, ssh_stderr = ssh.exec_command("psql -U factory -d factory -f /tmp/data.sql")
ssh_stdin.write('password\n')
ssh_stdin.flush()
output = ssh_stdout.read()
The article goes more in depth, describing a fully interactive shell around exec_command. I found this a lot easier to use than the examples in the source.
Original link: http://jessenoller.com/2009/02/05/ssh-programming-with-paramiko-completely-different/
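As a rough sketch of what such a fully interactive session can look like (this is not the article's code; the prompts, credentials and sleep-based timing are assumptions):
import time
import paramiko

# Sketch: a PTY-backed interactive shell, pumping data both ways by hand.
ssh = paramiko.SSHClient()
ssh.set_missing_host_key_policy(paramiko.AutoAddPolicy())
ssh.connect(server, username=username, password=password)

chan = ssh.invoke_shell()
chan.send("psql -U factory -d factory -f /tmp/data.sql\n")
time.sleep(1)                  # crude: give psql time to show its password prompt
chan.send("password\n")
time.sleep(1)
if chan.recv_ready():
    print(chan.recv(4096).decode(), end="")
ssh.close()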

You need Pexpect to get the best of both worlds (expect and ssh wrappers).
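For instance, a minimal pxssh sketch (host, credentials, the command and the password prompt pattern are all placeholders):
from pexpect import pxssh

# Sketch: let pexpect handle the prompt/response dance over ssh.
s = pxssh.pxssh()
s.login("server", "username", "password")
s.sendline("psql -U factory -d factory -f /tmp/data.sql")
s.expect("Password.*:")     # wait for psql's password prompt (pattern is a guess)
s.sendline("db_password")
s.prompt()                  # wait for the shell prompt to come back
print(s.before.decode())
s.logout()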

ssh = paramiko.SSHClient()
ssh.set_missing_host_key_policy(paramiko.AutoAddPolicy())
ssh.connect(server_IP,22,username, password)
stdin, stdout, stderr = ssh.exec_command('/Users/lteue/Downloads/uecontrol-CXC_173_6456-R32A01/uecontrol.sh -host localhost ')
alldata = ""
while not stdout.channel.exit_status_ready():
    solo_line = ""
    # Print stdout data when available
    if stdout.channel.recv_ready():
        # Retrieve the first 1024 bytes
        solo_line = stdout.channel.recv(1024)
        alldata += solo_line
    if cmp(solo_line, 'uec> ') == 0:  # Change conditionals to your code here
        if num_of_input == 0:
            data_buffer = ""
            for cmd in commandList:
                # print cmd
                stdin.channel.send(cmd)  # send input command 1
            num_of_input += 1
        if num_of_input == 1:
            stdin.channel.send('q \n')  # send input command 2; in my code this exits the interactive session and the connection will close
            num_of_input += 1
print alldata
ssh.close()
Why does stdout.read() hang if it is used directly, without checking stdout.channel.recv_ready() inside while not stdout.channel.exit_status_ready()?
In my case, after running the command on the remote server, the session waits for user input; after entering 'q' it closes the connection.
But before 'q' is entered, stdout.read() keeps waiting for EOF, so this method does not seem to work if the buffer is larger.
I tried stdout.read(1) in the while loop, and it works.
I tried stdout.readline() in the while loop, and it works as well.
stdin, stdout, stderr = ssh.exec_command('/Users/lteue/Downloads/uecontrol')
stdout.read() will hang
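A minimal sketch of the byte-at-a-time variant described above (Python 3 style; the uec> prompt is this answer's example, so adjust the check to your tool):
# Sketch: accumulate output one byte at a time and watch for the prompt.
stdin, stdout, stderr = ssh.exec_command('/Users/lteue/Downloads/uecontrol')
buffer = ''
while not stdout.channel.exit_status_ready():
    buffer += stdout.read(1).decode()
    if buffer.endswith('uec> '):
        stdin.write('q\n')   # ask the interactive session to exit
        stdin.flush()
print(buffer)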

I'm not familiar with paramiko, but this may work:
ssh_stdin.write('input value')
ssh_stdin.flush()
For information on stdin:
http://docs.python.org/library/sys.html?highlight=stdin#sys.stdin

Take a look at this example and do it in a similar way
(source: http://jessenoller.com/2009/02/05/ssh-programming-with-paramiko-completely-different/):
ssh.connect('127.0.0.1', username='jesse', password='lol')
stdin, stdout, stderr = ssh.exec_command("sudo dmesg")
stdin.write('lol\n')
stdin.flush()
data = stdout.read().splitlines()
for line in data:
    if line.split(':')[0] == 'AirPort':
        print line

You can use this method to send whatever confirmation message you want, like "OK" or the password. This is my solution, with an example:
def SpecialConfirmation(command, message, reply):
    net_connect.config_mode()  # To enter config mode
    net_connect.remote_conn.sendall(str(command) + '\n')
    time.sleep(3)
    output = net_connect.remote_conn.recv(65535).decode('utf-8')
    ReplyAppend = ''
    if str(message) in output:
        for i in range(0, len(reply)):
            ReplyAppend += str(reply[i]) + '\n'
        net_connect.remote_conn.sendall(ReplyAppend)
        output = net_connect.remote_conn.recv(65535).decode('utf-8')
    print(output)
    return output


CryptoPkiEnroll = ['', '', 'no', 'no', 'yes']
output = SpecialConfirmation('crypto pki enroll TCA', 'Password', CryptoPkiEnroll)
print (output)

Related

Run interactive python class within paramiko [duplicate]

I'm having a problem with a ShoreTel voice switch, and I'm trying to use Paramiko to jump into it and run a couple commands. What I believe the problem might be, is that the ShoreTel CLI gives different prompts than the standard Linux $. It would look like this:
server1$:stcli
Mitel>gotoshell
CLI> (This is where I need to enter 'hapi_debug=1')
Is Python still expecting that $, or am I missing something else?
I thought it might be a time thing, so I put those time.sleep(1) between commands. Still doesn't seem to be taking.
import paramiko
import time
keyfile = "****"
User = "***"
ip = "****"
command1 = "stcli"
command2 = "gotoshell"
command4 = "hapi_debug=1"
ssh = paramiko.SSHClient()
print('paramikoing...')
ssh.set_missing_host_key_policy(paramiko.AutoAddPolicy())
ssh.connect(hostname = ip, username = User, key_filename = keyfile)
print('giving er a go...')
ssh.invoke_shell()
stdin, stdout, stderr = ssh.exec_command(command1)
time.sleep(1)
stdin, stdout, stderr = ssh.exec_command(command2)
time.sleep(1)
stdin, stdout, stderr = ssh.exec_command(command4)
time.sleep(1)
print(stdout.read())
ssh.close()
print("complete")
What I would expect from the successful execution of this code, would be for the hapi_debug level to be 1. Which means that when I SSH into the thing, I would see those HAPI debugs populating. When I do, I do not see those debugs.
I assume that the gotoshell and hapi_debug=1 are not top-level commands, but subcommands of the stcli. In other words, the stcli is kind of a shell.
In that case, you need to write the commands that you want to execute in the subshell to its stdin:
stdin, stdout, stderr = ssh.exec_command('stcli')
stdin.write('gotoshell\n')
stdin.write('hapi_debug=1\n')
stdin.flush()
If you call stdout.read afterwards, it will wait until the command stcli finishes, which it never does. If you want to keep reading the output, you need to send a command that terminates the subshell (typically exit\n).
stdin.write('exit\n')
stdin.flush()
print(stdout.read())

Python Paramiko - wait on more output from passed command before exiting

I have the following code
client = paramiko.SSHClient()
client.set_missing_host_key_policy(paramiko.client.AutoAddPolicy())
privatekeyfile = 'PK_FILE_PATH'
username ="USERNAME"
mykey = paramiko.RSAKey.from_private_key_file(privatekeyfile)
client.connect(hostname= IP, username=username, pkey=mykey)
command = SERVER_COMMAND
stdin, stdout, stderr = client.exec_command(command)
while not stdout.channel.exit_status_ready():
    if stdout.channel.recv_ready():
        stdoutLines = stdout.readlines()
        print stdoutLines
The command I'm executing takes about 10 seconds to run on the server. It initially returns some information (user profile and module version), then runs some code to check the status of some local server resources.
Paramiko is closing the connection after it receives the initial header information. I need it to wait for the full output of the serverside command to return. I've tried implementing tintin's solution here, with the same result
Any ideas?
Add get_pty=True.
This will wait until the command execution has completed.
stdin, stdout, stderr = self.ssh.exec_command(command, get_pty=True)
Paramiko is closing the connection after it receives the initial header information.
I do not think that's true. Try running a command like
command = 'echo first && sleep 60 && echo second'
while not stdout.channel.exit_status_ready():
    if stdout.channel.recv_ready():
        stdoutLines = stdout.readlines()
        print stdoutLines
You will get both lines (also note that I'm printing the lines within the loop, so you can see them as they arrive).
It must be something with your command, like one of these (see the sketch after this list for checking stderr as well):
The command prints the final output on stderr, not stdout.
The command does not print the final output when executed without a TTY.
The command is a script that executes a sub-command in the background, so the script finishes before the sub-command does.
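A minimal sketch for ruling out the first case, polling both stdout and stderr in the same loop (reusing the client and command variables from the question):
# Sketch: watch both streams so output on either one is seen.
stdin, stdout, stderr = client.exec_command(command)
channel = stdout.channel
while not channel.exit_status_ready():
    if channel.recv_ready():
        print(channel.recv(1024).decode(), end='')          # stdout
    if channel.recv_stderr_ready():
        print(channel.recv_stderr(1024).decode(), end='')   # stderr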

Paramiko can't get output for grep command

I am using the paramiko library.
I am using shell = ssh.invoke_shell() and shell.send().
The command I am sending is cmd = """grep -i "Lost LLUS websocket" /var/log/debesys/cme.log\n""".
But I am not getting any output.
I am using shell.recv to get the output, but I always get a blank result. I have tested the command manually and it works fine. Does anybody have any idea how to get the output?
I'm not sure if this works exactly the way you want it, but only going off the question title, here's how I would do it if you wanted to send the grep command itself:
def run_cmd(sshClient, command):
    channel = sshClient.get_transport().open_session()
    channel.get_pty()
    channel.exec_command(command)
    out = channel.makefile().read()
    err = channel.makefile_stderr().read()
    returncode = channel.recv_exit_status()
    channel.close()  # channel is closed, but not the client
    return out, err, returncode
client = paramiko.SSHClient()
client.set_missing_host_key_policy(paramiko.AutoAddPolicy())
client.connect(hostname, username = user, password = pw)
out, err, rc = run_cmd(client, 'grep -i "Lost LLUS websocket" /var/log/debesys/cme.log')
# Do whatever with the output
# Run any other commands
client.close()
Of course, there are a number of other options. You could scp the file to the local machine, open it in Python, and search through it. You could even cat the file over an ssh call and search the output in Python; a sketch of the copy-then-search route follows.
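For instance, a minimal sketch of that copy-then-search route using Paramiko's SFTP client instead of scp (reusing the client object from above):
# Sketch: fetch the log over SFTP and grep it locally in Python.
sftp = client.open_sftp()
sftp.get('/var/log/debesys/cme.log', 'cme.log')
sftp.close()

with open('cme.log') as f:
    matches = [line for line in f if 'lost llus websocket' in line.lower()]
print(''.join(matches))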

Perform commands over ssh with Python

I'm writing a script to automate some command line commands in Python. At the moment, I'm doing calls like this:
cmd = "some unix command"
retcode = subprocess.call(cmd,shell=True)
However, I need to run some commands on a remote machine. Manually, I would log in using ssh and then run the commands. How would I automate this in Python? I need to log in with a (known) password to the remote machine, so I can't just use cmd = ssh user@remotehost. I'm wondering if there's a module I should be using?
I will refer you to paramiko
see this question
ssh = paramiko.SSHClient()
ssh.connect(server, username=username, password=password)
ssh_stdin, ssh_stdout, ssh_stderr = ssh.exec_command(cmd_to_execute)
If you are using ssh keys, do:
k = paramiko.RSAKey.from_private_key_file(keyfilename)
# OR k = paramiko.DSSKey.from_private_key_file(keyfilename)
ssh.set_missing_host_key_policy(paramiko.AutoAddPolicy())
ssh.connect(hostname=host, username=user, pkey=k)
Keep it simple. No libraries required.
import subprocess
# Python 2
subprocess.Popen("ssh {user}@{host} {cmd}".format(user=user, host=host, cmd='ls -l'), shell=True, stdout=subprocess.PIPE, stderr=subprocess.PIPE).communicate()
# Python 3
subprocess.Popen(f"ssh {user}@{host} {cmd}", shell=True, stdout=subprocess.PIPE, stderr=subprocess.PIPE).communicate()
Or you can just use commands.getstatusoutput:
commands.getstatusoutput("ssh machine 1 'your script'")
I used it extensively and it works great.
In Python 2.6+, use subprocess.check_output.
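For example (a sketch; the host and command are placeholders):
import subprocess

# Sketch: run a remote command via the system ssh client and capture its output.
output = subprocess.check_output(["ssh", "user@remotehost", "ls -l"])
print(output.decode())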
I found paramiko to be a bit too low-level, and Fabric not especially well-suited to being used as a library, so I put together my own library called spur that uses paramiko to implement a slightly nicer interface:
import spur
shell = spur.SshShell(hostname="localhost", username="bob", password="password1")
result = shell.run(["echo", "-n", "hello"])
print result.output # prints hello
If you need to run inside a shell:
shell.run(["sh", "-c", "echo -n hello"])
Everyone has already recommended paramiko, so I am just sharing some Python code (an API, one may say) that will allow you to execute multiple commands in one go.
To execute commands on a different node, use: Commands().run_cmd(host_ip, list_of_commands)
You will see one TODO, which I have kept as a reminder to stop the execution if any of the commands fails; I don't know how to do it, so please share your knowledge.
#!/usr/bin/python
import os
import sys
import select
import paramiko
import time


class Commands:
    def __init__(self, retry_time=0):
        self.retry_time = retry_time

    def run_cmd(self, host_ip, cmd_list):
        i = 0
        while True:
            # print("Trying to connect to %s (%i/%i)" % (self.host, i, self.retry_time))
            try:
                ssh = paramiko.SSHClient()
                ssh.set_missing_host_key_policy(paramiko.AutoAddPolicy())
                ssh.connect(host_ip)
                break
            except paramiko.AuthenticationException:
                print("Authentication failed when connecting to %s" % host_ip)
                sys.exit(1)
            except:
                print("Could not SSH to %s, waiting for it to start" % host_ip)
                i += 1
                time.sleep(2)

            # If we could not connect within time limit
            if i >= self.retry_time:
                print("Could not connect to %s. Giving up" % host_ip)
                sys.exit(1)

        # After connection is successful
        # Send the commands
        for command in cmd_list:
            # print command
            print "> " + command
            # execute commands
            stdin, stdout, stderr = ssh.exec_command(command)
            # TODO() : if an error is thrown, stop further rules and revert back changes
            # Wait for the command to terminate
            while not stdout.channel.exit_status_ready():
                # Only print data if there is data to read in the channel
                if stdout.channel.recv_ready():
                    rl, wl, xl = select.select([stdout.channel], [], [], 0.0)
                    if len(rl) > 0:
                        tmp = stdout.channel.recv(1024)
                        output = tmp.decode()
                        print output

        # Close SSH connection
        ssh.close()
        return


def main(args=None):
    if args is None:
        print "arguments expected"
    else:
        # args = {'<ip_address>', <list_of_commands>}
        mytest = Commands()
        mytest.run_cmd(host_ip=args[0], cmd_list=args[1])
    return


if __name__ == "__main__":
    main(sys.argv[1:])
paramiko finally worked for me after adding an additional line, which is a really important one (line 3):
import paramiko
p = paramiko.SSHClient()
p.set_missing_host_key_policy(paramiko.AutoAddPolicy()) # This script doesn't work for me unless this line is added!
p.connect("server", port=22, username="username", password="password")
stdin, stdout, stderr = p.exec_command("your command")
opt = stdout.readlines()
opt = "".join(opt)
print(opt)
Make sure that the paramiko package is installed.
Original source of the solution: Source
The accepted answer didn't work for me; here's what I used instead:
import paramiko
import os
ssh = paramiko.SSHClient()
ssh.set_missing_host_key_policy(paramiko.AutoAddPolicy())
# ssh.load_system_host_keys()
ssh.load_host_keys(os.path.expanduser('~/.ssh/known_hosts'))
ssh.connect("d.d.d.d", username="user", password="pass", port=22222)
ssh_stdin, ssh_stdout, ssh_stderr = ssh.exec_command("ls -alrt")
exit_code = ssh_stdout.channel.recv_exit_status() # handles async exit error
for line in ssh_stdout:
    print(line.strip())
total 44
-rw-r--r--. 1 root root 129 Dec 28 2013 .tcshrc
-rw-r--r--. 1 root root 100 Dec 28 2013 .cshrc
-rw-r--r--. 1 root root 176 Dec 28 2013 .bashrc
...
Alternatively, you can use sshpass:
import subprocess
cmd = """ sshpass -p "myPas$" ssh user@d.d.d.d -p 22222 'my command; exit' """
print( subprocess.getoutput(cmd) )
References:
https://github.com/onyxfish/relay/issues/11
https://stackoverflow.com/a/61016663/797495
Notes:
Just make sure to connect manually at least once to the remote system via ssh (ssh root@ip) and accept the public key; this is often the reason for not being able to connect using paramiko or other automated ssh scripts.
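If you would rather not connect manually first, a common alternative (a sketch, not part of the answer above) is to handle host keys in code:
import paramiko

# Sketch: trust existing known_hosts entries, or auto-add unknown host keys (less strict).
ssh = paramiko.SSHClient()
ssh.load_system_host_keys()
ssh.set_missing_host_key_policy(paramiko.AutoAddPolicy())
ssh.connect("d.d.d.d", username="user", password="pass", port=22222)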
I have used paramiko a bunch (nice) and pxssh (also nice). I would recommend either. They work a little differently but have a relatively large overlap in usage.
First: I'm surprised that no one has mentioned fabric yet.
Second: For exactly the requirements you describe, I've implemented my own Python module named jk_simpleexec. Its purpose: making it easy to run commands.
Let me explain a little bit about it for you.
The 'executing a command locally' problem
My python module jk_simpleexec provides a function named runCmd(..) that can execute a shell (!) command locally or remotely. This is very simple. Here is an example for local execution of a command:
import jk_simpleexec
cmdResult = jk_simpleexec.runCmd(None, "cd / ; ls -la")
NOTE: Be aware that the returned data is trimmed automatically by default to remove excessive empty lines from STDOUT and STDERR. (Of course this behavior can be deactivated, but for the purpose you have in mind, exactly that behavior is what you will want.)
The 'processing the result' problem
What you will receive is an object that contains the return code, STDOUT and STDERR. Therefore it's very easy to process the result.
And this is what you want to do, as the command you execute might exist and launch but still fail to do what it is intended to do. In the most simple case, where you're not interested in STDOUT and STDERR, your code will likely look something like this:
cmdResult.raiseExceptionOnError("Something went wrong!", bDumpStatusOnError=True)
For debugging purposes you want to output the result to STDOUT at some time, so for this you can do just this:
cmdResult.dump()
If you would want to process STDOUT it's simple as well. Example:
for line in cmdResult.stdOutLines:
    print(line)
The 'executing a command remotely' problem
Now of course we might want to execute this command remotely on another system. For this we can use the same function runCmd(..) in exactly the same way but we need to specify a fabric connection object first. This can be done like this:
from fabric import Connection
REMOTE_HOST = "myhost"
REMOTE_PORT = 22
REMOTE_LOGIN = "mylogin"
REMOTE_PASSWORD = "mypwd"
c = Connection(host=REMOTE_HOST, user=REMOTE_LOGIN, port=REMOTE_PORT, connect_kwargs={"password": REMOTE_PASSWORD})
cmdResult = jk_simpleexec.runCmd(c, "cd / ; ls -la")
# ... process the result stored in cmdResult ...
c.close()
Everything remains exactly the same, but this time we run this command on another host. This is intended: I wanted to have a uniform API where there are no modifications required in the software if you at some time decide to move from the local host to another host.
The password input problem
Now of course there is the password problem. This has been mentioned above by some users: We might want to ask the user executing this python code for a password.
For this problem I created my own module quite some time ago: jk_pwdinput. The difference from regular password input is that jk_pwdinput outputs stars instead of printing nothing, so for every password character you type you see a star. This makes it easier to enter a password.
Here is the code:
import jk_pwdinput
# ... define other 'constants' such as REMOTE_LOGIN, REMOTE_HOST ...
REMOTE_PASSWORD = jk_pwdinput.readpwd("Password for " + REMOTE_LOGIN + "@" + REMOTE_HOST + ": ")
(For completeness: If readpwd(..) returned None the user canceled the password input with Ctrl+C. In a real world scenario you might want to act on this appropriately.)
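For example, a minimal sketch of handling that cancellation (the exit path is an assumption, not part of the module):
import sys
import jk_pwdinput

REMOTE_PASSWORD = jk_pwdinput.readpwd("Password for " + REMOTE_LOGIN + "@" + REMOTE_HOST + ": ")
if REMOTE_PASSWORD is None:
    # The user pressed Ctrl+C during password input.
    print("Canceled.")
    sys.exit(1)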
Full example
Here is a full example:
import jk_simpleexec
import jk_pwdinput
from fabric import Connection
REMOTE_HOST = "myhost"
REMOTE_PORT = 22
REMOTE_LOGIN = "mylogin"
REMOTE_PASSWORD = jk_pwdinput.readpwd("Password for " + REMOTE_LOGIN + "@" + REMOTE_HOST + ": ")
c = Connection(host=REMOTE_HOST, user=REMOTE_LOGIN, port=REMOTE_PORT, connect_kwargs={"password": REMOTE_PASSWORD})
cmdResult = jk_simpleexec.runCmd(
    c = c,
    command = "cd / ; ls -la"
)
cmdResult.raiseExceptionOnError("Something went wrong!", bDumpStatusOnError=True)
c.close()
Final notes
So we have the full set:
Executing a command,
executing that command remotely via the same API,
creating the connection in an easy and secure way with password input.
The code above solves the problem quite well for me (and hopefully for you as well). And everything is open source: Fabric is BSD-2-Clause, and my own modules are provided under Apache-2.
Modules used:
fabric : http://www.fabfile.org/
jk_pwdinput : https://github.com/jkpubsrc/python-module-jk-pwdinput
jk_simpleexec : https://github.com/jkpubsrc/python-module-jk-simpleexec
Happy coding! ;-)
Works Perfectly...
import paramiko
import time
ssh = paramiko.SSHClient()
#ssh.load_system_host_keys()
ssh.set_missing_host_key_policy(paramiko.AutoAddPolicy())
ssh.connect('10.106.104.24', port=22, username='admin', password='')
time.sleep(5)
print('connected')
stdin, stdout, stderr = ssh.exec_command(" ")
def execute():
    stdin.write('xcommand SystemUnit Boot Action: Restart\n')
    print('success')


execute()
You can use either of these commands; this will also let you supply a password.
cmd = subprocess.run(["sshpass -p 'password' ssh -o StrictHostKeyChecking=no -o UserKnownHostsFile=/dev/null root@domain.com ps | grep minicom"], shell=True, stdout=subprocess.PIPE, stderr=subprocess.STDOUT)
print(cmd.stdout)
OR
cmd = subprocess.getoutput("sshpass -p 'password' ssh -o StrictHostKeyChecking=no -o UserKnownHostsFile=/dev/null root@domain.com ps | grep minicom")
print(cmd)
Have a look at spurplus, a wrapper we developed around spur that provides type annotations and some minor gimmicks (reconnecting SFTP, md5 etc.): https://pypi.org/project/spurplus/
Asking the user to enter the command appropriate to the device they are logging in to.
The code below was validated with PEP8online.com.
import paramiko
import xlrd
import time
ssh = paramiko.SSHClient()
ssh.set_missing_host_key_policy(paramiko.AutoAddPolicy())
loc = ('/Users/harshgow/Documents/PYTHON_WORK/labcred.xlsx')
wo = xlrd.open_workbook(loc)
sheet = wo.sheet_by_index(0)
Host = sheet.cell_value(0, 1)
Port = int(sheet.cell_value(3, 1))
User = sheet.cell_value(1, 1)
Pass = sheet.cell_value(2, 1)
def details(Host, Port, User, Pass):
    time.sleep(2)
    ssh.connect(Host, Port, User, Pass)
    print('connected to ip ', Host)
    stdin, stdout, stderr = ssh.exec_command("")
    x = input('Enter the command:')
    stdin.write(x)
    stdin.write('\n')
    print('success')


details(Host, Port, User, Pass)
# Reading the Host, username, password, and port from an Excel file
import paramiko
import xlrd
ssh = paramiko.SSHClient()
ssh.set_missing_host_key_policy(paramiko.AutoAddPolicy())
loc = ('/Users/harshgow/Documents/PYTHON_WORK/labcred.xlsx')
wo = xlrd.open_workbook(loc)
sheet = wo.sheet_by_index(0)
Host = sheet.cell_value(0,1)
Port = int(sheet.cell_value(3,1))
User = sheet.cell_value(1,1)
Pass = sheet.cell_value(2,1)
def details(Host, Port, User, Pass):
    ssh.connect(Host, Port, User, Pass)
    print('connected to ip ', Host)
    stdin, stdout, stderr = ssh.exec_command("")
    stdin.write('xcommand SystemUnit Boot Action: Restart\n')
    print('success')


details(Host, Port, User, Pass)
The most modern approach is probably to use fabric. This module allows you to set up an SSH connection and then run commands and get their results over the connection object.
Here's a simple example:
from fabric import Connection
with Connection("your_hostname") as connection:
    result = connection.run("uname -s", hide=True)
    msg = "Ran {0.command!r} on {0.connection.host}, got stdout:\n{0.stdout}"
    print(msg.format(result))
I wrote a simple class to run commands on a remote host over native ssh, using the subprocess module:
Usage
from ssh_utils import SshClient
client = SshClient(user='username', remote='remote_host', key_path='path/to/key.pem')
# run a list of commands
client.cmd(['mkdir ~/testdir', 'ls -la', 'echo done!'])
# copy files/dirs
client.scp('my_file.txt', '~/testdir')
Class source code
https://gist.github.com/mamaj/a7b378a5c969e3e32a9e4f9bceb0c5eb
import subprocess
from pathlib import Path
from typing import Union


class SshClient():
    """ Perform commands and copy files over ssh using subprocess
        and the native ssh client (OpenSSH).
    """

    def __init__(self,
                 user: str,
                 remote: str,
                 key_path: Union[str, Path]) -> None:
        """
        Args:
            user (str): username for the remote
            remote (str): remote host IP/DNS
            key_path (str or pathlib.Path): path to .pem file
        """
        self.user = user
        self.remote = remote
        self.key_path = str(key_path)

    def cmd(self,
            cmds: list[str],
            strict_host_key_checking=False) -> None:
        """Runs commands consecutively, ensuring success of each
           before calling the next command.
        Args:
            cmds (list[str]): list of commands to run.
            strict_host_key_checking (bool, optional): Defaults to False.
        """
        strict_host_key_checking = 'yes' if strict_host_key_checking \
            else 'no'
        cmd = ' && '.join(cmds)
        subprocess.run(
            [
                'ssh',
                '-i', self.key_path,
                '-o', f'StrictHostKeyChecking={strict_host_key_checking}',
                '-o', 'UserKnownHostsFile=/dev/null',
                f'{self.user}@{self.remote}',
                cmd
            ]
        )

    def scp(self, source: Union[str, Path], destination: Union[str, Path]):
        """Copies `source` file to remote `destination` using the
           native `scp` command.
        Args:
            source (Union[str, Path]): Source file path.
            destination (Union[str, Path]): Destination path on remote.
        """
        subprocess.run(
            [
                'scp',
                '-i', self.key_path,
                str(source),
                f'{self.user}@{self.remote}:{str(destination)}',
            ]
        )
The example below is for the case where you want user input for the hostname, username, password, and port number.
import paramiko
ssh = paramiko.SSHClient()
ssh.set_missing_host_key_policy(paramiko.AutoAddPolicy())
def details():
    Host = input("Enter the Hostname: ")
    Port = input("Enter the Port: ")
    User = input("Enter the Username: ")
    Pass = input("Enter the Password: ")
    ssh.connect(Host, Port, User, Pass, timeout=2)
    print('connected')
    stdin, stdout, stderr = ssh.exec_command("")
    stdin.write('xcommand SystemUnit Boot Action: Restart\n')
    print('success')


details()
