Python subprocess.Popen over network

Python subprocess.popen is easy as pie on local machine, but is it possible to call it over a network?
For example, say I have 3 PCs: one is called workstation-pc, the others are called node1-pc and node2-pc...
Is it possible to call a process on, say, node1-pc, from workstation-pc, preferably without having to run special server software on node1-pc?
In any case, many thanks for any response!
Gilles
EDIT
Forgot to mention that I am using Python 3

I would recommend the use of either:
execnet
fabric
Example with Fabric:
from fabric.api import env, run
env.hosts = ['host1', 'host2']
def mytask():
    run('ls /var/www')

Fabric is probably the way to go; however, there is nothing to stop you executing remote commands via ssh in *nix environments.
import subprocess

p = subprocess.Popen(['ssh', 'node1-pc', 'ls', '-ltr', '/etc'],
                     stdout=subprocess.PIPE, stderr=subprocess.PIPE)
out, err = p.communicate()
out will contain the stdout of the remote process.
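Since the original question mentions Python 3, note that pipes opened by Popen yield bytes there, so the output usually needs decoding. A minimal runnable sketch (a local echo stands in for the ['ssh', 'node1-pc', ...] argument list so it works without a remote host):

```python
import subprocess

# A local command stands in for ['ssh', 'node1-pc', ...] so the sketch
# runs anywhere; on Python 3, piped output arrives as bytes.
p = subprocess.Popen(['echo', 'hello from node1'],
                     stdout=subprocess.PIPE, stderr=subprocess.PIPE)
out, err = p.communicate()
print(out.decode().strip())   # decode bytes -> str before using the text
```
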

Related

How to pass commands to an SSH from subprocess in Python

I have used the subprocess module in Python 2.7.6 to establish an SSH connection. I realise that this is not recommended, but I am unable to install other Python SSH libraries such as paramiko and fabric.
I was just wondering if someone wouldn't mind telling me how I'd now go about it. So far I have:
sshProcess = subprocess.call(['ssh', '-t', '<REMOTE>', 'ssh', '<REMOTE>'])
I want to carry out commands in REMOTE with the subprocess approach. Is there any way to do this? Unfortunately, REMOTE is protected by a password which the user manually enters. If it helps, I'm running the Windows 10 Bash shell.
Any help is appreciated.
Running a remote command is as simple as putting it on the ssh command line. (At the protocol level this is distinguishable to the SSH server from feeding the command on stdin, but the command-line mechanism is built for programmatic use, whereas the interactive-shell model was designed for human use.)
By the way, if you want to run multiple commands via distinct SSH invocations over a single connection after authenticating only once, I'd strongly suggest using Paramiko for this, but you can do it with OpenSSH command-line tools by using SSH multiplexing support.
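As a hedged sketch of that OpenSSH multiplexing approach: the `ControlMaster`/`ControlPath`/`ControlPersist` options let repeated ssh invocations reuse one authenticated connection. The host name and path below are placeholders, not values from the question:

```python
import shlex

# Placeholder multiplexing options: the first ssh invocation opens a master
# connection; later ones sharing the same ControlPath reuse it.
mux_opts = ['-o', 'ControlMaster=auto',
            '-o', 'ControlPath=~/.ssh/cm-%r@%h:%p',
            '-o', 'ControlPersist=10m']
cmd = ['ssh'] + mux_opts + ['node1-pc', 'ls /etc']
print(' '.join(shlex.quote(c) for c in cmd))
```

Each subsequent `subprocess.call(cmd)` built this way would then skip re-authentication while the master connection persists.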
Let's say you have an array representing your remote command:
myCommand = [ 'ls', '-l', '/tmp/my directory name with spaces' ]
To get that into a string (in a way that honors the spaces and won't let a maliciously-selected name run arbitrary commands on the remote server), you'd use:
import pipes  # Python 2; on Python 3 this lives at shlex.quote
myCommandStr = ' '.join(pipes.quote(n) for n in myCommand)
Now, you have something you can pass as a command line argument to ssh:
subprocess.call(['ssh', '-t', hostname, myCommandStr])
However, let's say you want to nest this. You can just repeat the process:
myCommand = [ 'ssh', '-t', hostname1, myCommandStr ]
myCommandStr = ' '.join(pipes.quote(n) for n in myCommand)
subprocess.call(['ssh', '-t', hostname2, myCommandStr])
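For reference, the same quote-and-nest technique on Python 3, where pipes.quote has become shlex.quote (hostnames are placeholders):

```python
import shlex

# Quote each word so spaces survive and nothing is shell-interpreted.
my_command = ['ls', '-l', '/tmp/my directory name with spaces']
inner_str = ' '.join(shlex.quote(n) for n in my_command)
print(inner_str)   # ls -l '/tmp/my directory name with spaces'

# Nesting just repeats the quoting, once per ssh hop.
inner = ['ssh', '-t', 'host1', inner_str]
outer_str = ' '.join(shlex.quote(n) for n in inner)
```

Splitting `outer_str` back with `shlex.split` recovers the `inner` list exactly, which is what makes the nesting safe to repeat.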
Because we aren't redirecting stdin or stdout, they should still be pointed at the terminal from which your Python program was started, so SSH should be able to execute its password prompts directly.
That said, specifically for ssh'ing through an interim system, you don't need to go through this much trouble: You can tell ssh to do that work for you with the ProxyJump option:
myCommand = [ 'ls', '-l', '/tmp/my directory name with spaces' ]
myCommandStr = ' '.join(pipes.quote(n) for n in myCommand)
subprocess.call(['ssh', '-o', 'ProxyJump=%s' % hostname1, hostname2, myCommandStr])
From your comment, you say you can connect. After that, to interact over ssh using subprocess you will need something like:
ssh = subprocess.Popen(['ssh', '<remote client>'],
                       stdin=subprocess.PIPE,
                       stdout=subprocess.PIPE,
                       stderr=subprocess.PIPE)
back = ssh.stdout.readlines()
if back == []:
    error = ssh.stderr.readlines()
    print(error)
else:
    print(back)
Then to send commands, like listing a directory, something like:
ssh.stdin.write(b"ls\n")
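The write-then-read pattern can be sketched end to end with a local `cat` standing in for the ssh process (whatever is written to its stdin comes straight back on its stdout), which also shows why stdin must eventually be flushed and closed:

```python
import subprocess

# `cat` stands in for the ssh process so this runs without a remote host.
proc = subprocess.Popen(['cat'],
                        stdin=subprocess.PIPE,
                        stdout=subprocess.PIPE)
proc.stdin.write(b"ls\n")   # send a command line
proc.stdin.flush()          # push it out of Python's buffer
proc.stdin.close()          # EOF lets readlines() return instead of block
back = proc.stdout.readlines()
print(back)                 # [b'ls\n']
```

Without the close(), readlines() would block forever waiting for EOF; with a real ssh session you would read incrementally instead of closing stdin after one command.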

How to interact with a Terminal in python

I'm working on a small script. The script should open 3 terminals and interact with this terminals independently.
I understand that subprocess is probably the best way to do that. What I've done so far:
#!/usr/bin/env python
import subprocess
term1 = subprocess.Popen(["open", "-a", "Terminal"], stdin=subprocess.PIPE, stdout=subprocess.PIPE)
term1.communicate(input="pwd")
My problem is that I cannot interact with the new terminal: the term1.communicate(input="pwd") part is not working, and I cannot send a command to the new Terminal. I also tried term1.communicate(input="pwd\n"), but nothing happens.
Do you have any ideas how I can do that?
P.S. I am using Mac OS.
You can run both commands concurrently without opening terminals.
import subprocess
process1 = subprocess.Popen(["ls", "-l"])
process2 = subprocess.Popen(["ls", "-l"])
If you run that code you will see that the directory is listed twice, interleaved together. You can expand this for your specific needs:
tcprelay1 = subprocess.Popen(["tcprelay", "telnet"])
tcprelay2 = subprocess.Popen(["tcprelay", "--portoffset", "[arg1]", "[arg2]"])
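If you then need to know when those concurrent processes finish and whether they succeeded, each Popen object can be waited on. A minimal sketch using a harmless local command in place of tcprelay:

```python
import subprocess

# Launch two processes concurrently, then wait for both and collect
# their exit codes; `true` stands in for the real commands.
procs = [subprocess.Popen(['true']) for _ in range(2)]
codes = [p.wait() for p in procs]
print(codes)   # [0, 0]
```
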

Persistent ssh session in Python using Popen

I am creating a movie controller (Pause/Stop...) using python where I ssh into a remote computer, and issue commands into a named pipe like so
echo -n q > ~/pipes/pipename
I know this works if I ssh via the terminal and do it myself, so there is no problem with the setup of the named pipe redirection. My problem is that setting up an ssh session takes time (1-3 seconds), whereas I want the pause command to be instantaneous. Therefore, I thought of setting up a persistent pipe like so:
controller = subprocess.Popen ( "ssh -T -x <hostname>", shell = True, close_fds = True, stdin=subprocess.PIPE, stderr=subprocess.PIPE, stdout=subprocess.PIPE )
Then issue commands to it like so
controller.stdin.write ( 'echo -n q > ~/pipes/pipename' )
I think the problem is that ssh is interactive so it expects a carriage return. This is where my problems begin, as nearly everyone who has asked this question has been told to use an existing module:
Vivek's answer
Chakib's Answer
shx2's Answer
Crafty Thumber's Answer
Artyom's Answer
Jon W's Answer
Which is fine, but I am so close. I just need to know how to include the carriage return, otherwise, I have to go learn all these other modules, which mind you is not trivial (for example, right now I can't figure out how pexpect uses either my /etc/hosts file or my ssh keyless authentications).
To add a newline to the command, you will need to add a newline to the string:
controller.stdin.write('\n')
You may also need to flush the pipe:
controller.stdin.flush()
And of course the controller has to be ready to receive new data, or you could block forever trying to send it data. (And if the reason it's not ready is that it's blocking forever waiting for you to read from its stdout, which is possible on some platforms, you're deadlocked unrecoverably.)
I'm not sure why it's not working the way you have it set up, but I'll take a stab at this. I think what I would do is change the Popen call to:
controller = subprocess.Popen("ssh -T -x <hostname> \"sh -c 'cat > ~/pipes/pipename'\"", ...
And then simply controller.stdin.write('q').
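The write/flush pattern behind that suggestion can be sketched with a local `sh -s` standing in for `ssh -T <hostname>`, since both read newline-terminated commands from stdin (this assumes a POSIX shell; `printf` is used instead of `echo -n` because `-n` is not portable across shells):

```python
import subprocess

# `sh -s` reads commands from stdin, like a non-interactive ssh session.
controller = subprocess.Popen(['sh', '-s'],
                              stdin=subprocess.PIPE,
                              stdout=subprocess.PIPE)
controller.stdin.write(b"printf q\n")   # the newline submits the command
controller.stdin.flush()                # don't leave it in Python's buffer
out, _ = controller.communicate()       # closes stdin, collects output
print(out)                              # b'q'
```

With a real persistent ssh session you would keep stdin open and repeat the write/flush pair for each subsequent command instead of calling communicate().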

How to send a stream with python fabric

I need to send the stdout stream of a program across a network, to the stdin of another program, running on another host.
This can be easily accomplished using ssh:
program 1 | ssh host 'program 2'
It's trivial to call this using subprocess:
subprocess.call("program 1 | ssh host 'program 2'", shell=True)
However, since I need to run many other commands on the remote host, I'm using fabric.
Sending files with fabric is easy enough, but I can't find any documentation on sending streams. I know fabric uses the paramiko ssh library, so I could use its channel, but there seems to be no documentation for accessing the channel from fabric.
I ended up digging through the fabric source code (fabric.operations._execute) and came up with this:
from fabric.state import default_channel
import subprocess

def remote_pipe(local_command, remote_command, buf_size=1024):
    '''executes a local command and a remote command (with fabric), and
    sends the local's stdout to the remote's stdin.
    based on fabric.operations._execute'''
    local_p = subprocess.Popen(local_command, shell=True, stdout=subprocess.PIPE)
    channel = default_channel()  # fabric function
    channel.set_combine_stderr(True)
    channel.exec_command(remote_command)
    read_bytes = local_p.stdout.read(buf_size)
    while read_bytes:
        channel.sendall(read_bytes)
        read_bytes = local_p.stdout.read(buf_size)
    local_ret = local_p.wait()
    channel.shutdown_write()
    received = channel.recv(640*1024)  # ought to be enough for everyone
    remote_ret = channel.recv_exit_status()
    if local_ret != 0 or remote_ret != 0:
        raise Exception("remote_pipe failed: " + received)
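For comparison, the plain-subprocess route (no fabric) can build the `program 1 | ssh host 'program 2'` pipeline without shell=True by chaining pipes directly; here two local commands stand in for the two programs so the sketch is runnable anywhere:

```python
import subprocess

# p1's stdout is wired directly into p2's stdin, like a shell pipe.
p1 = subprocess.Popen(['echo', 'stream me'], stdout=subprocess.PIPE)
p2 = subprocess.Popen(['tr', 'a-z', 'A-Z'],
                      stdin=p1.stdout, stdout=subprocess.PIPE)
p1.stdout.close()            # so p1 sees SIGPIPE if p2 exits early
out, _ = p2.communicate()
print(out)                   # b'STREAM ME\n'
```

In the real case p2 would be `['ssh', 'host', 'program 2']`; the wiring is identical.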

Make python enter password when running a csh script

I'm writing a python script that executes a csh script in Solaris 10. The csh script prompts the user for the root password (which I know) but I'm not sure how to make the python script answer the prompt with the password. Is this possible? Here is what I'm using to execute the csh script:
import commands
commands.getoutput('server stop')
Have a look at the pexpect module. It is designed to deal with interactive programs, which seems to be your case.
Oh, and remember that hard-coding root's password in a shell or python script is potentially a security hole :D
Use subprocess. Call Popen() to create your process and use communicate() to send it text. Sorry, I forgot to include the PIPE:
from subprocess import Popen, PIPE
proc = Popen(['server', 'stop'], stdin=PIPE)
proc.communicate('password')
You would do better to avoid the password and try a scheme like sudo and sudoers. Pexpect, mentioned elsewhere, is not part of the standard library.
import pexpect
child = pexpect.spawn('server stop')
child.expect_exact('Password:')
child.sendline('password')
print "Stopping the servers..."
index = child.expect_exact(['Server processes successfully stopped.', 'Server is not running...'], 60)
child.expect(pexpect.EOF)
Did the trick! Pexpect rules!
Adding input= to proc.communicate() makes it run, for those who prefer the standard library:
from subprocess import Popen, PIPE
proc = Popen(['server', 'stop'], stdin=PIPE)
proc.communicate(input='password')
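On Python 3 the same call needs bytes (or pipes opened with text=True), and the prompt usually wants a trailing newline. A runnable sketch with a local `cat` standing in for the password-reading command:

```python
from subprocess import Popen, PIPE

# `cat` stands in for `server stop`; it echoes back whatever it receives,
# which makes the bytes-in/bytes-out behaviour visible.
proc = Popen(['cat'], stdin=PIPE, stdout=PIPE)
out, _ = proc.communicate(input=b'password\n')
print(out)   # b'password\n'
```
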
Should be able to pass it as a parameter. something like:
commands.getoutput('server stop -p password')
This seems to work better:
import popen2
(stdout, stdin) = popen2.popen2('server stop')
stdin.write("password")
But it's not 100% yet. Even though "password" is the correct password, I'm still getting "su: Sorry" back from the csh script when it tries to su to root.
To avoid having to answer the Password question in the python script I'm just going to run the script as root. This question is still unanswered but I guess I'll just do it this way for now.
