How to send a stream with python fabric - python

I need to send the stdout stream of a program across a network, to the stdin of another program, running on another host.
This can be easily accomplished using ssh:
program 1 | ssh host 'program 2'
It's trivial to call this using subprocess:
subprocess.call("program 1 | ssh host 'program 2'", shell=True)
However, since I need to run many other commands on the remote host, I'm using fabric.
Sending files with fabric is easy enough, but I can't find any documentation on sending streams. I know fabric uses the paramiko ssh library, so I could use its channel, but there seems to be no documentation for accessing the channel from fabric.

I ended up digging through the fabric source code (fabric.operations._execute) and came up with this:
from fabric.state import default_channel
import subprocess

def remote_pipe(local_command, remote_command, buf_size=1024):
    '''Executes a local command and a remote command (with fabric), and
    sends the local command's stdout to the remote command's stdin.
    Based on fabric.operations._execute.'''
    local_p = subprocess.Popen(local_command, shell=True, stdout=subprocess.PIPE)
    channel = default_channel()  # fabric function
    channel.set_combine_stderr(True)
    channel.exec_command(remote_command)
    read_bytes = local_p.stdout.read(buf_size)
    while read_bytes:
        channel.sendall(read_bytes)
        read_bytes = local_p.stdout.read(buf_size)
    local_ret = local_p.wait()
    channel.shutdown_write()
    received = channel.recv(640*1024)  # ought to be enough for everyone
    remote_ret = channel.recv_exit_status()
    if local_ret != 0 or remote_ret != 0:
        raise Exception("remote_pipe failed: " + received)
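A hypothetical usage inside a fabric 1.x task might look like the sketch below; the host name and both commands (pg_dump/psql) are placeholders, not part of the original answer:

from fabric.api import env, task

env.hosts = ['host']

@task
def transfer():
    # Stream a local database dump straight into a remote restore, without a temp file.
    remote_pipe("pg_dump mydb", "psql mydb")

The task would then be invoked as usual with fab transfer.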

Related

Paramiko exec_command only works when printing exit status [duplicate]

This question already has an answer here: Paramiko SSH exec_command (shell script) returns before completion.
I am writing a Python program on Ubuntu. In that program I am trying to print a message after a task ("delete a file") completes on a remote machine (a Raspberry Pi) connected to the network.
In practice, however, the print statement does not wait for the task on the remote machine to complete.
Can anybody guide me on how to do that?
My code is given below:
import paramiko
# Connection with remote machine
client = paramiko.SSHClient()
client.set_missing_host_key_policy(paramiko.AutoAddPolicy())
client.connect('192.168.2.34', username='pi', password='raspberry')
filename = 'fahad.txt'
filedelete ='rm ' + filename
stdin, stdout, stderr = client.exec_command(filedelete)
print ("File Deleted")
client.close()
This is indeed a duplicate of paramiko SSH exec_command(shell script) returns before completion, but the answer there is not terribly detailed. So...
As you noticed, exec_command is a non-blocking call. So you have to wait for completion of the remote command by using either:
Channel.exit_status_ready if you want a non-blocking check of command completion (i.e. polling; a sketch of that variant follows the example below)
Channel.recv_exit_status if you want to block until the command completes (and get back the exit status; an exit status of 0 means normal completion).
In your particular case, you need the latter:
stdin, stdout, stderr = client.exec_command(filedelete) # Non-blocking call
exit_status = stdout.channel.recv_exit_status() # Blocking call
if exit_status == 0:
print ("File Deleted")
else:
print("Error", exit_status)
client.close()
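For completeness, a rough sketch of the non-blocking (polling) variant using Channel.exit_status_ready; the 0.5-second interval is an arbitrary choice:

import time

stdin, stdout, stderr = client.exec_command(filedelete)  # Non-blocking call
while not stdout.channel.exit_status_ready():            # Poll until the remote command finishes
    time.sleep(0.5)                                       # Arbitrary polling interval
exit_status = stdout.channel.recv_exit_status()           # Returns immediately at this point
print("File Deleted" if exit_status == 0 else "Error {}".format(exit_status))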
In addition to doing what Sylvain Leroux suggests:
If your commands involve running a bash script that needs to keep running after paramiko closes the ssh session (every time you send a command this happens) use:
nohup ./my_bash_script.sh >/dev/null 2>&1.
nohup tells the system that this process should ignore the "hang up" signal received when the ssh session is closed.
>/dev/null 2>&1 redirects the output. This is necessary because in some situations control will not be given back to your python script until an output is received.
To run command line applications like "stress" and "vlc" and keep them running after you return, the only solution I have found is to put your commands in a bash script followed by a & or &>/dev/null then call that bash script with paramiko using the method I mention in the previous paragraph.
This seems a bit "hacky" but it is the only way I have found after days of searching.
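A minimal sketch of that approach, reusing the connection details from the question above; the script path is a placeholder:

import paramiko

client = paramiko.SSHClient()
client.set_missing_host_key_policy(paramiko.AutoAddPolicy())
client.connect('192.168.2.34', username='pi', password='raspberry')
# Hypothetical script path; nohup plus the redirection and the trailing '&'
# detach the script from the session, so it survives client.close().
stdin, stdout, stderr = client.exec_command('nohup /home/pi/my_bash_script.sh >/dev/null 2>&1 &')
stdout.channel.recv_exit_status()  # Wait for the wrapper shell; the script itself keeps running
client.close()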

Python Paramiko "exec_command" does not execute - Django

I am facing an issue with the Python Paramiko library in my Django application.
This is a function I have written to create an SFTP connection:
def createSFTPConnection(request,temp_pwd):
ssh=paramiko.SSHClient()
ssh.set_missing_host_key_policy(paramiko.AutoAddPolicy())
user = User.objects.get(username=request.user)
ssh.connect(hostname=temp_host,username=user.username,password=temp_pwd,port=22)
sftp_client=ssh.open_sftp()
return ssh,user,sftp_client
This just returns the ssh handle, the user, and sftp_client.
Then I execute a command on the remote server using this code:
ssh,user,sftp_client=createSFTPConnection(request,temp_pwd) # passes the password on that server for the user for creating the connection
cmd_to_execute="(cd "+temporary_path+"; sh temp.sh"+" "+var1+" "+var2+")" # executing a shell script by passing it 2 variables
stdin, stdout, stderr = ssh.exec_command(cmd_to_execute) # actual call
print("stderr: ", stderr.readlines())
print("pwd: ", stdout.readlines())
Now, this code works fine and executes the script "temp.sh" on the remote server, but it takes a lot of time because I am returning stdin, stdout, and stderr and printing them on the console.
Since I don't want that, I removed the readlines() calls, making my code look like this:
cmd_to_execute="(cd "+temporary_path+"; sh temp.sh"+" "+var1+" "+var2+")"
stdin, stdout, stderr = ssh.exec_command(cmd_to_execute) # actual call
But for some reason, this code just doesn't execute on the remote server after removing the readlines() calls,
which makes me think that exec_command does not work without a readlines() call after it.
I don't know why this is happening.
Any help would be highly appreciated!
Thanks!
For your info, this is the Django code that follows the readlines() calls:
usr_msg="Your file has been uploaded successfully!! This is your variable: "+var1
messages.success(request, usr_msg)
ssh.close()
sftp_client.close()
return redirect("/execute/all")
SSHClient.exec_command only starts the execution of the command. If you do not wait for it to complete and immediately kill the session, the command is killed along with it.
If you want to keep the command running even after you disconnect, you need to detach it from the session.
See Running process of remote SSH server in the background using Python Paramiko.
It's basically not a Python/Paramiko question, see also Getting ssh to execute a command in the background on target machine.
First, make it work in ssh/plink/whatever client before trying to implement it in Python. Something like:
ssh user@example.com "cd path; nohup sh script.sh > /dev/null 2>&1 &"
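Translated back to Paramiko, a rough equivalent might be the sketch below; the remote directory and script name are placeholders:

# Sketch only: remote directory and script name are placeholders.
cmd = 'cd /path/to/dir; nohup sh script.sh > /dev/null 2>&1 &'
stdin, stdout, stderr = ssh.exec_command(cmd)
stdout.channel.recv_exit_status()  # Waits only for the wrapper shell; the detached script keeps running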

Erlang: port to Python instance not responding

I am trying to communicate to an external python process through an Erlang port. First, a port is opened, then a message is sent to the external process via stdin. I am expecting a corresponding reply on the process's stdout.
My attempt looks like this:
% open a port
Port = open_port( {spawn, "python -u -"},
[exit_status, stderr_to_stdout, {line, 1000000}] ).
% send a command to the port
true = port_command( Port, "print( \"Hello world.\" )\n" ).
% gather response
% PROBLEM: no matter how long I wait flushing will return nothing
flush().
% close port
true = port_close( Port ).
% still nothing
flush().
I realize that someone else on Stack Overflow tried to do something similar, but the proposed solution apparently doesn't work for me.
Also, I see that a related post on Erlang Central is starting a Python script through an Erlang port but it is not the Python shell itself that is invoked.
I have taken notice of ErlPort but I have a whole script to be executed in Python. If possible, I wouldn't want to break up the script into single Python calls.
Funny enough, doing it with bash is no problem:
Port = open_port( {spawn, "bash"},
[exit_status, stderr_to_stdout, {line, 1000000}] ).
true = port_command( Port, "echo \"Hello world.\"\n" ).
So the above example gives me a "Hello world." on flushing:
3> flush().
Shell got {#Port<0.544>,{data,{eol,"Hello world."}}}
ok
Just what I wanted to see.
Ubuntu 15.04 64 bit
Erlang 18.1
Python 2.7.9
Edit:
I have finally decided to write a script file (with a shebang) to disk and execute that script file, instead of piping the script to the language interpreter, for some languages (like Python).
I suspect the problem has to do with the way some interpreters buffer IO, which I just can't work around and which makes this extra round trip to disk necessary.
As you've discovered, ports don't do what you'd like for this problem, which is why alternatives like ErlPort exist. An old workaround for this problem is to use netcat to pipe commands into python so that a proper EOF occurs. Here's an example session:
1> PortOpts = [exit_status, stderr_to_stdout, {line,1000000}].
[exit_status,stderr_to_stdout,{line,1000000},use_stdio]
2> Port = open_port({spawn, "nc -l 51234 | python"}, PortOpts).
#Port<0.564>
3> {ok, S} = gen_tcp:connect("localhost", 51234, []).
{ok,#Port<0.565>}
4> gen_tcp:send(S, "print 'hello'\nprint 'hello again'\n").
ok
5> gen_tcp:send(S, "print 'hello, one more time'\n").
ok
6> gen_tcp:close(S).
ok
7> flush().
Shell got {#Port<0.564>,{data,{eol,"hello"}}}
Shell got {#Port<0.564>,{data,{eol,"hello again"}}}
Shell got {#Port<0.564>,{data,{eol,"hello, one more time"}}}
Shell got {#Port<0.564>,{exit_status,0}}
ok
This approach opens a port running netcat as a listener on port 51234 — you can choose whatever port you wish, of course, as long as it's not already in use — with its output piped into python. We then connect to netcat over the local TCP loopback and send python command strings into it, which it then forwards through its pipe to python. Closing the socket causes netcat to exit, which results in an EOF on python's stdin, which in turn causes it to execute the commands we sent it. Flushing the Erlang shell message queue shows we got the results we expected from python via the Erlang port.

Pseudo terminal will not be allocated error - ssh - sudo - websocket - subprocess

I basically want to create a web page through which a Unix terminal on the server side can be reached: commands are sent to the terminal and their results are received from it.
For this, I have a WSGIServer. When a connection is opened, I execute the following:
def opened(self):
self.p = Popen(["bash", "-i"], bufsize=1, stdin=PIPE, stdout=PIPE, stderr=STDOUT)
self.p.stdout = Unbuffered(self.p.stdout)
self.t = Thread(target=self.listen_stdout)
self.t.daemon = True
self.t.start()
When a message comes to the server from the client, it is handled in the following function, which simply forwards the incoming message to the stdin of subprocess p, the interactive bash:
def received_message(self, message):
print(message.data, file=self.p.stdin)
The output of bash is then read in the following function, which runs in a separate thread t and simply sends the output to the client.
def listen_stdout(self):
while True:
c = self.p.stdout.read(1)
self.send(c)
In such a system, I am able to send any command (ls, cd, mkdir, etc.) to the bash running on the server side and receive its output. However, when I try to run ssh xxx@xxx, the error "Pseudo-terminal will not be allocated because stdin is not a terminal" is shown.
Similarly, when I run sudo ..., the password prompt is somehow not sent to the client; instead, it appears on the terminal of the server script.
I am aware of expect; however, just for such sudo and ssh usage I do not want to clutter my code with expect. Instead, I am looking for a general solution that can fool sudo and ssh and redirect their prompts to the client.
Is there any way to solve this? Ideas are appreciated, thanks.
I found the solution. What I needed was to create a pseudo-terminal and, on the slave side of the tty, call setsid() to make the process a new session leader, then run the commands on it.
Details are here:
http://www.rkoucha.fr/tech_corner/pty_pdip.html
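A minimal Python sketch of that idea, using the standard pty module (pty.fork() performs the setsid() and controlling-terminal setup mentioned above); wiring it into the WSGI handler is left out:

import os
import pty

# pty.fork() creates a child attached to the slave side of a new pseudo-terminal
# (it calls setsid() internally, so the pty becomes the controlling terminal).
pid, master_fd = pty.fork()
if pid == 0:
    # Child: run an interactive bash on the pty.
    os.execvp("bash", ["bash", "-i"])
else:
    # Parent: write commands to the master side and read terminal output
    # (including ssh/sudo password prompts) back from it.
    os.write(master_fd, b"echo hello\n")
    print(os.read(master_fd, 1024))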
