Send Enter key on Python Paramiko SSH

I'm trying to create a function that connects from one machine to another via minicom. After minicom starts, Enter must be pressed before commands can be sent to the connected machine. My Python code is as follows:
ssh = paramiko.SSHClient()
ssh.load_system_host_keys()
ssh.set_missing_host_key_policy(paramiko.AutoAddPolicy())
ssh.connect(self.serialHost, username=self.username, password=self.password)
shell = ssh.invoke_shell()
shell.send('minicom free -o')
shell.send('\u000d')
ssh.close()
Can someone tell me how to send the Enter key correctly?

Typically, when trying to execute commands in paramiko you don't have to invoke a shell and can just call ssh.exec_command(...). If the command you want to execute depends on the environment that starting a shell would give you, then you have to explicitly call the invoke_shell() method.
When you use invoke_shell() in Paramiko, you have to send the line termination character(s) that the particular shell expects. If the machine you're ssh'ing to has bash as its default shell, you have to send a newline (i.e. '\n') character after each command. For example:
shell.send('ls\n')
instead of
shell.send('ls')
If you're connecting to an older Windows machine, you need to send both a carriage return and a newline (i.e. '\r\n') for the command to be processed.
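Applied to the original question, a minimal sketch might look like the following (hedged: the one-second sleeps are a crude way to wait for remote output, and serialHost/username/password stand in for the question's connection details):
import time
import paramiko

ssh = paramiko.SSHClient()
ssh.load_system_host_keys()
ssh.set_missing_host_key_policy(paramiko.AutoAddPolicy())
ssh.connect(serialHost, username=username, password=password)

shell = ssh.invoke_shell()
shell.send('minicom free -o\n')  # '\n' terminates the command line in bash
time.sleep(1)                    # crude wait; a robust script would poll recv_ready()
shell.send('\r')                 # carriage return acts as the Enter key inside minicom
time.sleep(1)
if shell.recv_ready():
    print(shell.recv(4096).decode())
ssh.close()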

You can try '\r\n' for the Return key, but I think there is also an exec_command method that would not require the Return key at all. Something like:
ssh.exec_command('minicom free -o')

Related

Python Paramiko "exec_command" does not execute - Django

I am facing an issue with the Python Paramiko library in my Django application.
This is a function I have written to create an SFTP connection:
def createSFTPConnection(request, temp_pwd):
    ssh = paramiko.SSHClient()
    ssh.set_missing_host_key_policy(paramiko.AutoAddPolicy())
    user = User.objects.get(username=request.user)
    ssh.connect(hostname=temp_host, username=user.username, password=temp_pwd, port=22)
    sftp_client = ssh.open_sftp()
    return ssh, user, sftp_client
This returns the ssh handle, the user object, and sftp_client.
Then I execute a command on the remote server using this code -
ssh,user,sftp_client=createSFTPConnection(request,temp_pwd) # passes the password on that server for the user for creating the connection
cmd_to_execute="(cd "+temporary_path+"; sh temp.sh"+" "+var1+" "+var2+")" # executing a shell script by passing it 2 variables
stdin, stdout, stderr = ssh.exec_command(cmd_to_execute) # actual call
print("stderr: ", stderr.readlines())
print("pwd: ", stdout.readlines())
Now, this code works fine and executes the script temp.sh on the remote server, but it takes a lot of time, since I am reading stdout and stderr and printing them to the console.
Since I don't want that, I removed the readlines() calls from there, making my code look like this -
cmd_to_execute="(cd "+temporary_path+"; sh temp.sh"+" "+var1+" "+var2+")"
stdin, stdout, stderr = ssh.exec_command(cmd_to_execute) # actual call
But for some reason, this code just doesn't execute on the remote server after removing the readlines() calls.
This makes me think that exec_command does not work without a readlines() call following it.
And I don't know why this is happening.
Any help would be highly appreciated!!
Thanks!!
For your info -
This is the Django code after the readlines() calls -
usr_msg="Your file has been uploaded successfully!! This is your variable: "+var1
messages.success(request, usr_msg)
ssh.close()
sftp_client.close()
return redirect("/execute/all")
The SSHClient.exec_command only starts execution of the command. If you do not wait for it to complete and immediately kill the session, the command is killed along with it.
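If waiting is acceptable, blocking on the exit status is enough; a minimal sketch (hedged: this assumes the script's output is small enough not to fill the pipe buffers, otherwise stdout/stderr must still be drained while waiting):
stdin, stdout, stderr = ssh.exec_command(cmd_to_execute)
exit_code = stdout.channel.recv_exit_status()  # block until the remote command finishes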
If you want to keep the command running even after you disconnect, you need to detach it from the session.
See Running process of remote SSH server in the background using Python Paramiko.
It's basically not a Python/Paramiko question, see also Getting ssh to execute a command in the background on target machine.
First, make it work in ssh/plink/whatever client before trying to implement it in Python. Something like:
ssh user@example.com "cd path; nohup sh script.sh > /dev/null 2>&1 &"
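Once that works from a regular client, the Paramiko equivalent might look like this sketch (hedged: temporary_path, var1 and var2 are the question's variables; nohup plus the redirections are what detach the script from the SSH session):
command = "cd {}; nohup sh temp.sh {} {} > /dev/null 2>&1 &".format(temporary_path, var1, var2)
ssh.exec_command(command)  # returns immediately; the script keeps running after disconnect
ssh.close()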

Force password authentication (ignore keys in .ssh folder) in Paramiko in Python

I'm trying to write a small Python program to check whether an SSH server allows a password authentication. Here is the current plan:
import base64
import paramiko
client = paramiko.SSHClient()
client.set_missing_host_key_policy(paramiko.AutoAddPolicy())
client.connect('ssh.example.com', username='strongbad', password='thecheat')
stdin, stdout, stderr = client.exec_command('ls')
for line in stdout:
    print('... ' + line.strip('\n'))
client.close()
The idea is to grep the output or to later put a try/except block around the connect statement.
My problem however is that some of the systems that I run the program on have access via an RSA key that is stored under ~/.ssh. In these cases, the connect will simply succeed (which I want to avoid).
So, here is the question: Does anybody know any way to force Paramiko (or another SSH client) to use passwords?
Thanks
The SSHClient.connect method has a look_for_keys argument. Set it to False:
client.connect(
    'ssh.example.com', username='strongbad', password='thecheat',
    look_for_keys=False)
Similarly you may want to set allow_agent to False as well.
Obligatory warning: Do not use AutoAddPolicy unless you do not care about security. You are losing protection against MITM attacks this way.
For a correct solution, see Paramiko "Unknown Server".
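Combining the two, a sketch of the password-authentication check the question describes (hedged: AuthenticationException is what Paramiko raises on failed authentication; the host and credentials are the question's placeholders):
import paramiko

client = paramiko.SSHClient()
client.load_system_host_keys()  # rely on known hosts instead of AutoAddPolicy
try:
    client.connect(
        'ssh.example.com', username='strongbad', password='thecheat',
        look_for_keys=False, allow_agent=False)
    print('Password authentication accepted')
except paramiko.AuthenticationException:
    print('Password authentication rejected')
finally:
    client.close()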

SSH and exec channels with python shell

We have implemented a python shell for our hardware devices that solely consists of the python cmd module on embedded linux. Our (non-root) user's shell is set to the path of this python shell in /etc/passwd and /etc/shadow. Code example below:
#!/usr/bin/env python
import cmd

class OurCmdProcessor(cmd.Cmd, object):

    def __init__(self):
        cmd.Cmd.__init__(self)
        ...

    def cmdloop(self, intro=None):
        """Command loop.

        Overrides cmd class cmdloop() method.
        """
        signal.signal(signal.SIGINT, self._sigint_handler)
        try:
            cmd.Cmd.cmdloop(self, intro=intro)
        except:
            print("{} v2 exception!".format(branding))
            #traceback.print_exc(file=sys.stdout)
            sys.stdout.flush()
            # This *exits* the cmd shell on exception
            self.do_EOF()

    def do_help(self):
        print("Here is some help text!")

etc...
Previously, one of our clients had used SSH.NET to issue command line commands using that library's RunCommand function, which sets up a standard SSH 'exec' request to go over the SSH connection, and then parses the output and return value. (i.e. request channel, channel success, send command, etc..)
Now, that call doesn't work, presumably because we've switched from /bin/sh to this python shell. What does work is using that library's SSH Shell object to send commands over by putting the command text followed by a newline, then scanning the output.
What I'm asking is: is it possible to implement something in that shell to handle the standard SSH 'exec' request this library is issuing? Or does the shell output received after issuing an SSH shell request already include the exit value? We don't want to include exit values as part of the command's printable output.
We are using dropbear SSH server on our embedded linux device.
To work in this mode, your Python script should be able to parse its command-line arguments in the same way it parses arguments given interactively to your Cmd instance.
That is:
./yourpython -c "some command"
should work identically to:
./yourpython <<EOF
some command
EOF
...and should have an exit status that reflects whether the last command to be executed succeeded.
This is equivalent to how ssh hostname 'some command' runs "${SHELL:-sh}" -c 'some command' on the remote host.
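A hedged sketch of what that entry point could look like (the OurCmdProcessor name comes from the question; the -c handling and the exit-status convention are assumptions, since cmd.Cmd itself does not track command success):
import sys

if __name__ == '__main__':
    processor = OurCmdProcessor()
    if len(sys.argv) >= 3 and sys.argv[1] == '-c':
        # SSH 'exec' path: run the single command, then exit with a status
        # (this assumes your do_* handlers return None/False on success)
        result = processor.onecmd(sys.argv[2])
        sys.exit(1 if result else 0)
    else:
        # Interactive path: normal read-eval loop
        processor.cmdloop()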

How to send a stream with python fabric

I need to send the stdout stream of a program across a network, to the stdin of another program, running on another host.
This can be easily accomplished using ssh:
program 1 | ssh host 'program 2'
It's trivial to call this using subprocess:
subprocess.call("program 1 | ssh host 'program 2'", shell=True)
However, since I need to run many other commands on the remote host, I'm using fabric.
Sending files with fabric is easy enough, but I can't find any documentation on sending streams. I know fabric uses the paramiko ssh library, so I could use its channel, but there seems to be no documentation for accessing the channel from fabric.
I ended up digging through the fabric source code (fabric.operations._execute) and came up with this:
from fabric.state import default_channel
import subprocess

def remote_pipe(local_command, remote_command, buf_size=1024):
    '''Executes a local command and a remote command (with fabric), and
    sends the local command's stdout to the remote command's stdin.
    Based on fabric.operations._execute.'''
    local_p = subprocess.Popen(local_command, shell=True, stdout=subprocess.PIPE)
    channel = default_channel()  # fabric function
    channel.set_combine_stderr(True)
    channel.exec_command(remote_command)
    read_bytes = local_p.stdout.read(buf_size)
    while read_bytes:
        channel.sendall(read_bytes)
        read_bytes = local_p.stdout.read(buf_size)
    local_ret = local_p.wait()
    channel.shutdown_write()
    received = channel.recv(640 * 1024)  # ought to be enough for everyone
    remote_ret = channel.recv_exit_status()
    if local_ret != 0 or remote_ret != 0:
        raise Exception("remote_pipe failed: " + received)
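For example, to reproduce the original pipeline from inside a fabric task (hypothetical task name; program 1 and program 2 are the question's placeholders):
def stream_between_hosts():
    remote_pipe("program 1", "program 2")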

Python SSH not giving full output

I am trying to write a script that logs onto a remote machine, runs a command and returns the output. I'm doing this in python, using the paramiko library. However, for some reason the full output isn't being produced, only a single line of it.
In an attempt to isolate the problem, I created a local script, called simple, which runs the command and sends the output to a file, remote_test_output.txt. Then I simply sftp the file over instead. The file only contained the same one line. The only line of output is the same every time: the response code of the command.
When I do this all manually (ssh over, log in, and run ./simple), it all works as intended and the output file is correct. However, doing it through the script on my machine, it only returns the single line.
my code:
ssh = paramiko.SSHClient()
ssh.set_missing_host_key_policy(paramiko.AutoAddPolicy())
ssh.load_host_keys(os.path.expanduser(os.path.join("~", ".ssh", "known_hosts")))
ssh.connect(host, username, password)
ssh_stdin, ssh_stdout, ssh_stderr = ssh.exec_command('LD_LIBRARY_PATH=/opt/bin ./simple\n')
print "output:", ssh_stdout.read()+"end" #Reading output of the executed command
print "err:", ssh_stderr.read()#Reading the error stream of the executed command
sftp = ssh.open_sftp()
sftp.get('remote_test_output.txt', 'local_test_output.txt')
sftp.close()
What is returned:
response code: 128
What should be returned:
field1:value1
field2:value2
response code: 128
field3:value3
field4:value4
etc
Does anyone have any ideas why the command I'm trying to call isn't outputting normally?
I have to include the LD_LIBRARY_PATH variable assignment or I get a library does not exist error.
According to Paramiko's documentation, the "exec_command" method is built on a Channel object. First question: did you try setting the bufsize parameter to one? The Channel object "behaves like a socket", as the documentation says:
http://www.lag.net/paramiko/docs/paramiko.Channel-class.html
It means that recv() (and possibly read()) will only read the data that is already in the read buffer. So when your read() call returns, it does not mean that the process has finished on the remote side. You should use the exit_status_ready() method to check whether your command has completed:
http://www.lag.net/paramiko/docs/paramiko.Channel-class.html#exit_status_ready
And only after that can you read all the data. Well, this is what I guess. I may be wrong, but right now I cannot test my theory.
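A minimal sketch of that idea (hedged: recv_exit_status() blocks until the remote command completes; for large outputs the streams should be drained while waiting so the pipe does not stall):
stdin, stdout, stderr = ssh.exec_command('LD_LIBRARY_PATH=/opt/bin ./simple')
exit_status = stdout.channel.recv_exit_status()  # block until the remote command finishes
print("output: " + stdout.read().decode())
print("err: " + stderr.read().decode())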
