Connecting subprocess pipes / running a command through a jump box (Python)

So I want to use Python to run a command on a server, but to do that I have to go through another box. I connect to the first box by running ssh through subprocess. However, I'm unsure how to then get into the second box and run commands through that subprocess object.
My subprocess statement:
command = "ssh servername"
sshConnection = subprocess.Popen(command.split(),stderr=subprocess.PIPE
,stdout=subprocess.PIPE,stdin=subprocess.PIPE)
The only methods I can think of:
Somehow connect the pipes of two subprocess commands?
Establish an SSH tunnel first, then run commands through that?
Any other ideas? Am I approaching this wrong?

If, by "go through another box", you mean "initiate an ssh connection from the jumpbox to the server", try:
command = "ssh jumpbox ssh servername /bin/ls -l"
sshConnection = subprocess.Popen(command.split(), stderr=subprocess.PIPE,
                                 stdout=subprocess.PIPE, stdin=subprocess.PIPE)

How about:
cmd = "mycmd myarg1 myarg2"
ssh_command = "ssh servername %s" % cmd
sshConnection = subprocess.Popen(ssh_command.split(), stderr=subprocess.PIPE,
                                 stdout=subprocess.PIPE, stdin=subprocess.PIPE)
Of course, you also need to shell-escape the command if it contains special characters.
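For example, a minimal sketch of the escaping step using pipes.quote from the standard library (shlex.quote on Python 3); the command string is a placeholder:
import pipes

cmd = "ls -l '/tmp/dir with spaces'"
ssh_command = "ssh servername %s" % pipes.quote(cmd)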
A slicker alternative that I personally like is plumbum, which lets you nest remote (ssh) commands.
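A minimal plumbum sketch (hostnames are placeholders) that hops through a jump box might look like this:
from plumbum import SshMachine

jump = SshMachine("jumpbox")                   # first hop
inner = jump["ssh"]["servername", "ls", "-l"]  # a second ssh, run on the jump box
print(inner())                                 # returns the remote command's stdout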

Since you are doing a lot of ssh work with subprocess, I suggest using Fabric instead.
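For instance, a minimal Fabric 1.x sketch (hostnames are placeholders); env.gateway routes every connection through the jump box:
from fabric.api import env, execute, run

env.gateway = 'user@jumpbox'
env.hosts = ['user@servername']

def list_home():
    run('ls -l')

execute(list_home)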

Related

Become root user and execute command after performing ssh [duplicate]

I use a friend's server that allows only one user to be logged in via SSH, so normally I just log in as that user and then do su -l myuser to change accounts. I wanted to automate some boring stuff using Python, but I ran into problems with that. Apparently the Paramiko module, which I tried first, invokes a separate shell for every command, so that was out of the question. Later I tried using invoke_shell() to overcome that, but it still failed (I assume because changing the user changes the shell as well).
After that I found the Fabric module, but the best I could do was open an SSH shell with the proper user logged in, with no option to run any commands from code.
Is there any way to accomplish that? The final goal would probably look something like this:
ssh.login(temp_user, password)
ssh.command("su -l myuser")
expect("Password: ", ssh.send("mypass\n"))
ssh.command("somescript.sh > datadump.txt")
Using sudo is impossible, as is adding a passwordless login.
As suggested, here is the code that I tried with Paramiko:
import paramiko

host = "hostip"
user = "user"
user_to_log = "myuser"
password = "pass"
password_to_log = "mypass"
login_command = "su -l " + user_to_log

ssh = paramiko.SSHClient()
ssh.load_system_host_keys()
ssh.set_missing_host_key_policy(paramiko.AutoAddPolicy())
ssh.connect(host, username=user, password=password)  # was ssh.connect(hostip, ...), a NameError

transport = ssh.get_transport()
session = transport.open_session()
session.set_combine_stderr(True)
session.get_pty()
session.exec_command(login_command)
stdin = session.makefile('wb', -1)
stdin.write(password_to_log + '\n')
stdin.flush()
session.exec_command("whoami")  # fails: a Paramiko channel can only exec one command
stdout = session.makefile('rb', -1)
for line in stdout.read().splitlines():
    print('host: %s: %s' % (host, line))
su -c <command> won't work either, since the server's system doesn't support that option.
General disclaimers first (to others who stumble upon this question):
Using su is not the right solution. su is a tool intended for interactive use, not for automation. The correct solution is to log in with the correct account directly.
Or at least use a password-less sudo.
Or create a root-owned script with the setuid bit set.
See also Allowing automatic command execution as root on Linux using SSH.
If you are stuck with su, on most systems you can use its -c switch to specify a command:
su -c "whoami" user
See also How to run sudo with Paramiko? (Python)
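With the Paramiko client from the question, that might look like this (a sketch; get_pty=True makes su's password prompt readable on stdout):
stdin, stdout, stderr = ssh.exec_command('su -l %s -c whoami' % user_to_log, get_pty=True)
stdin.write(password_to_log + '\n')  # answer the "Password:" prompt
stdin.flush()
print(stdout.read())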
If none of the above is feasible (and you have really tried hard to get the admin to enable one of the options above):
As a last resort, you can write the command to the standard input of su, the same way you already write the password (another thing not to do):
stdin, stdout, stderr = ssh.exec_command("su -l " + user_to_log, get_pty=True)
stdin.write(password_to_log + '\n')
stdin.flush()
command = 'whoami'
stdin.write(command + '\n')
stdin.flush()
(Also note that it's redundant to call makefile, as exec_command already returns the file objects; calling it on the client, as above, gives you them directly.)
See Execute (sub)commands in secondary shell/command on SSH server in Python Paramiko.
Note that your question is not about which SSH client library to use. It does not matter whether you use Paramiko or something else; this is really a generic SSH/Linux/shell question.

How to pass commands to an SSH from subprocess in Python

I have used the subprocess module in Python 2.7.6 to establish an SSH connection. I realise that this is not recommended, but I am unable to install other Python SSH libraries such as paramiko and fabric.
I was just wondering if someone could tell me where to go from here:
sshProcess = subprocess.call(['ssh', '-t', '<REMOTE>', 'ssh', '<REMOTE>'])
I want to carry out commands on REMOTE with the subprocess approach. Is there any way to do this? Unfortunately, REMOTE is protected by a password which the user enters manually. If it helps, I'm running the Windows 10 Bash shell.
Any help is appreciated.
Running a remote command is as simple as putting it on the command line. (To the SSH server this is distinguishable at the protocol level from feeding the command on stdin, but the command-line form is built for programmatic use, whereas the interactive-shell model was designed for human use.)
By the way, if you want to run multiple commands via distinct SSH invocations over a single connection, authenticating only once, I'd strongly suggest using Paramiko, but you can also do it with the OpenSSH command-line tools by using their multiplexing support.
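A sketch of that multiplexing approach (the hostname is a placeholder); the first call authenticates and leaves a master connection open, and later calls reuse it without prompting again:
import subprocess

mux = ['-o', 'ControlMaster=auto',
       '-o', 'ControlPath=~/.ssh/cm-%r@%h:%p',
       '-o', 'ControlPersist=10m']
subprocess.call(['ssh'] + mux + ['hostname', 'true'])      # opens the master connection
subprocess.call(['ssh'] + mux + ['hostname', 'ls', '-l'])  # reuses it, no new prompt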
Let's say you have an array representing your remote command:
myCommand = [ 'ls', '-l', '/tmp/my directory name with spaces' ]
To get that into a string (in a way that honors the spaces and can't let a maliciously-selected name run arbitrary commands on the remote server), you'd use:
import pipes  # on Python 3, use shlex.quote instead
myCommandStr = ' '.join(pipes.quote(n) for n in myCommand)
Now, you have something you can pass as a command line argument to ssh:
subprocess.call(['ssh', '-t', hostname, myCommandStr])
However, let's say you want to nest this. You can just repeat the process:
myCommand = [ 'ssh', '-t', hostname1, myCommandStr ]
myCommandStr = ' '.join(pipes.quote(n) for n in myCommand)
subprocess.call(['ssh', '-t', hostname2, myCommandStr])
Because we aren't redirecting stdin or stdout, they should still point at the terminal from which your Python program was started, so SSH should be able to issue its password prompts directly.
That said, specifically for ssh'ing through an interim system, you don't need to go to this much trouble: you can tell ssh to do that work for you with the ProxyJump option:
myCommand = [ 'ls', '-l', '/tmp/my directory name with spaces' ]
myCommandStr = ' '.join(pipes.quote(n) for n in myCommand)
subprocess.call(['ssh', '-o', 'ProxyJump=%s' % hostname1, hostname2, myCommandStr])
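On OpenSSH 7.3 or newer, the -J flag is a shorthand for the same option:
subprocess.call(['ssh', '-J', hostname1, hostname2, myCommandStr])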
From your comment, you say you can connect. So after that, to interact over ssh using subprocess you will need something like:
ssh = subprocess.Popen(['ssh', '<remote client>'],  # placeholder hostname
                       stdin=subprocess.PIPE,
                       stdout=subprocess.PIPE,
                       stderr=subprocess.PIPE)      # stderr must be piped to be readable below
back = ssh.stdout.readlines()  # note: blocks until ssh closes its stdout
if back == []:
    error = ssh.stderr.readlines()
    print(error)
else:
    print(back)
Then, to send commands (like listing a directory), do something like:
ssh.stdin.write("ls\n")
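Note that pipe writes are buffered, so flush after each write (ssh.stdin.flush()), and readlines() only returns once the remote side closes the stream. If you can send everything up front, a simpler pattern is to let communicate() do the work (a sketch):
out, err = ssh.communicate("ls\nexit\n")  # sends the commands, closes stdin, collects output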

Pseudo terminal will not be allocated error - ssh - sudo - websocket - subprocess

I basically want to create a web page through which a Unix terminal on the server side can be reached: commands can be sent to it, and their results received from it.
For this, I have a WSGIServer. When a connection is opened, I execute the following:
# (uses Popen, PIPE, STDOUT from subprocess and Thread from threading)
def opened(self):
    self.p = Popen(["bash", "-i"], bufsize=1, stdin=PIPE, stdout=PIPE, stderr=STDOUT)
    self.p.stdout = Unbuffered(self.p.stdout)  # Unbuffered: a wrapper class defined elsewhere in the asker's code
    self.t = Thread(target=self.listen_stdout)
    self.t.daemon = True
    self.t.start()
When a message comes to the server from the client, it is handled by the following function, which just forwards the incoming message to the stdin of subprocess p, an interactive bash:
def received_message(self, message):
    print(message.data, file=self.p.stdin)
The output of bash is then read by the following function, running in the separate thread t; it just sends the output on to the client.
def listen_stdout(self):
    while True:
        c = self.p.stdout.read(1)
        self.send(c)
In such a system I am able to send any command (ls, cd, mkdir, etc.) to the bash running on the server side and receive its output. However, when I try to run ssh xxx@xxx, I get the error pseudo-terminal will not be allocated because stdin is not a terminal.
Similarly, when I run sudo ..., the password prompt is not sent to the client; instead it appears on the terminal of the server script.
I am aware of expect; however, I do not want to clutter my code with it just for this sudo and ssh usage. Instead, I am looking for a general solution that can fool sudo and ssh and redirect their prompts to the client.
Is there any way to solve this? Ideas are appreciated, thanks.
I found the solution. What I needed was to create a pseudo-terminal and, on the slave side of the tty, call setsid() to put the process in a new session, then run commands on it.
Details are here:
http://www.rkoucha.fr/tech_corner/pty_pdip.html
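A minimal sketch of that approach using the standard pty module, whose fork() performs the setsid() call and attaches the pseudo-terminal for you (the shell command is a placeholder):
import os
import pty

pid, master_fd = pty.fork()  # child gets a new session with the pty slave as its controlling tty
if pid == 0:
    os.execvp("bash", ["bash", "-i"])     # run the interactive shell on the slave side
else:
    os.write(master_fd, b"echo hello\n")  # feed commands to the master side
    print(os.read(master_fd, 1024))       # read terminal output back, prompts included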

How to send a stream with python fabric

I need to send the stdout stream of a program across a network, to the stdin of another program, running on another host.
This can be easily accomplished using ssh:
program 1 | ssh host 'program 2'
It's trivial to call this using subprocess:
subprocess.call("program 1 | ssh host 'program 2'", shell=True)
However, since I need to run many other commands on the remote host, I'm using fabric.
Sending files with fabric is easy enough, but I can't find any documentation on sending streams. I know fabric uses the paramiko ssh library, so I could use its channel, but there seems to be no documentation on accessing the channel from fabric.
I ended up digging through the fabric source code (fabric.operations._execute) and came up with this:
from fabric.state import default_channel
import subprocess

def remote_pipe(local_command, remote_command, buf_size=1024):
    '''Executes a local command and a remote command (with fabric), and
    sends the local command's stdout to the remote command's stdin.
    Based on fabric.operations._execute.'''
    local_p = subprocess.Popen(local_command, shell=True, stdout=subprocess.PIPE)
    channel = default_channel()  # fabric function
    channel.set_combine_stderr(True)
    channel.exec_command(remote_command)
    read_bytes = local_p.stdout.read(buf_size)
    while read_bytes:
        channel.sendall(read_bytes)
        read_bytes = local_p.stdout.read(buf_size)
    local_ret = local_p.wait()
    channel.shutdown_write()
    received = channel.recv(640 * 1024)  # ought to be enough for everyone
    remote_ret = channel.recv_exit_status()
    if local_ret != 0 or remote_ret != 0:
        raise Exception("remote_pipe failed: " + received)
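A hypothetical usage, called from inside a fabric task so that default_channel() refers to the current host (both commands are placeholders):
remote_pipe("tar cz /some/local/dir", "tar xz -C /backups")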
