I am trying to write a script that logs onto a remote machine, runs a command, and returns the output. I'm doing this in Python, using the paramiko library. However, for some reason only a single line of the full output is being produced.
In an attempt to isolate the problem, I created a local script, called simple, which runs the command and sends the output to a file, remote_test_output.txt. Then I simply sftp the file over instead. The file contained only the same one line: the response code of the command.
When I do this all manually (ssh over, log in, and run ./simple), it all works as intended and the output file is correct. However, when done through the script on my machine, only the single line comes back.
My code:
ssh = paramiko.SSHClient()
ssh.set_missing_host_key_policy(paramiko.AutoAddPolicy())
ssh.load_host_keys(os.path.expanduser(os.path.join("~", ".ssh", "known_hosts")))
ssh.connect(host, username=username, password=password)  # keyword args: passed positionally, username would be taken as the port
ssh_stdin, ssh_stdout, ssh_stderr = ssh.exec_command('LD_LIBRARY_PATH=/opt/bin ./simple\n')
print "output:", ssh_stdout.read()+"end" #Reading output of the executed command
print "err:", ssh_stderr.read()#Reading the error stream of the executed command
sftp = ssh.open_sftp()
sftp.get('remote_test_output.txt', 'local_test_output.txt')
sftp.close()
What is returned:
response code: 128
What should be returned:
field1:value1
field2:value2
response code: 128
field3:value3
field4:value4
etc
Does anyone have any ideas why the command I'm trying to call isn't outputting normally?
I have to include the LD_LIBRARY_PATH variable assignment or I get a library does not exist error.
According to paramiko's documentation, the exec_command method returns a Channel object. First question: did you try setting the bufsize parameter to one? The documentation says that the Channel object "behaves like a socket".
It means that recv() (and presumably read()) will only return the data that is already in the read buffer. So when your read() call returns, it does not mean that the command has finished executing on the remote side. You should use the exit_status_ready() method to check whether your command has completed:
http://www.lag.net/paramiko/docs/paramiko.Channel-class.html#exit_status_ready
And only after that can you read all the data. Well, this is what I guess. I may be wrong, but right now I cannot test my theory.
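If this theory is right, a polling helper along these lines would block until the command has finished (the helper name, interval, and timeout are my own invention, not part of paramiko):

```python
import time

def wait_for_exit(channel, poll_interval=0.1, timeout=30.0):
    """Poll a paramiko Channel until the remote command has finished,
    then return its exit status.  Raises RuntimeError on timeout."""
    deadline = time.time() + timeout
    while not channel.exit_status_ready():
        if time.time() > deadline:
            raise RuntimeError("remote command did not finish in time")
        time.sleep(poll_interval)
    return channel.recv_exit_status()

# With the question's code it would be used roughly like this
# (assuming `ssh` is a connected SSHClient):
#
#   stdin, stdout, stderr = ssh.exec_command('LD_LIBRARY_PATH=/opt/bin ./simple')
#   status = wait_for_exit(stdout.channel)
#   output = stdout.read()   # now safe: the command has exited
```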
Related
I am facing an issue with the Python Paramiko library in my Django Application
This is a function I have written to create an SFTP connection:
def createSFTPConnection(request, temp_pwd):
    ssh = paramiko.SSHClient()
    ssh.set_missing_host_key_policy(paramiko.AutoAddPolicy())
    user = User.objects.get(username=request.user)
    ssh.connect(hostname=temp_host, username=user.username, password=temp_pwd, port=22)
    sftp_client = ssh.open_sftp()
    return ssh, user, sftp_client
This just returns the ssh client, the user, and sftp_client.
Then I execute a command on the remote server using this code -
ssh,user,sftp_client=createSFTPConnection(request,temp_pwd) # passes the password on that server for the user for creating the connection
cmd_to_execute="(cd "+temporary_path+"; sh temp.sh"+" "+var1+" "+var2+")" # executing a shell script by passing it 2 variables
stdin, stdout, stderr = ssh.exec_command(cmd_to_execute) # actual call
print("stderr: ", stderr.readlines())
print("pwd: ", stdout.readlines())
Now, this code works fine and executes the script "temp.sh" on the remote server, but it takes a lot of time because I am returning stdin, stdout, and stderr and printing them on the console.
But since I don't want that, I removed the readlines() calls, making my code look like this -
cmd_to_execute="(cd "+temporary_path+"; sh temp.sh"+" "+var1+" "+var2+")"
stdin, stdout, stderr = ssh.exec_command(cmd_to_execute) # actual call
But for some reason, this code just doesn't execute on the remote server after removing the readlines() calls.
This makes me think that exec_command does not work without a readlines() call after it.
And I don't know why this is happening.
Any help would be highly appreciated!!
Thanks!!
For your info -
This is the Django code after the readlines() calls -
usr_msg="Your file has been uploaded successfully!! This is your variable: "+var1
messages.success(request, usr_msg)
ssh.close()
sftp_client.close()
return redirect("/execute/all")
SSHClient.exec_command only starts the execution of the command. If you do not wait for it to complete and immediately close the session, the command is killed along with it.
If you want to keep the command running even after you disconnect, you need to detach it from the session.
See Running process of remote SSH server in the background using Python Paramiko.
It's basically not a Python/Paramiko question, see also Getting ssh to execute a command in the background on target machine.
First, make it work in ssh/plink/whatever client before trying to implement it in Python. Something like:
ssh user@example.com "cd path; nohup sh script.sh > /dev/null 2>&1 &"
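A rough Python equivalent of that command line might look like this; the helper just builds the shell string, and the paths and names are placeholders:

```python
def detached_command(path, script, log="/dev/null"):
    """Build a shell command that keeps running after the SSH session ends:
    cd into the directory, run the script under nohup, redirect all output,
    and background it.  `path`, `script`, and `log` are placeholders."""
    return 'cd {0}; nohup sh {1} > {2} 2>&1 &'.format(path, script, log)

# Hypothetical usage with a connected SSHClient `ssh`:
#   ssh.exec_command(detached_command('/home/user/app', 'temp.sh'))
#   ssh.close()   # the script keeps running on the server
```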
I haven't been able to resolve this issue, but I suspect it's easy for someone familiar with Paramiko/ssh2 to figure out.
The code below works fine when executed only once, but when wrapped in a while loop it hangs on stdout.read(). I could not use exec_command because it was not returning the correct output (the device I am SSHing into is not a standard microcontroller, and I'm still uncertain exactly what encoding or SSH parameters it uses). Since this worked, I wanted to query the device continuously, but it didn't work once I wrapped the commands in a while loop.
I also tried changing how the while loop was wrapped, including wrapping the whole code block starting with the initial SSH connection, wrapping around chan.close(), etc.
import paramiko
import time

freewave_shell = paramiko.SSHClient()
freewave_shell.set_missing_host_key_policy(paramiko.AutoAddPolicy())
freewave_shell.connect("an.ip.add.ress", username="user", password="pass")
chan = freewave_shell.invoke_shell()
while True:
    stdin = chan.makefile_stdin('wb')
    stdout = chan.makefile('rb')
    stdin.write('''
signalLevel
noiseLevel
signalMargin
VSWR
exit
''')
    print('HERE')
    print(stdout.read())
    stdout.close()
    stdin.close()
    chan.close()
freewave_shell.close()
I do not think your code is anywhere near reliable.
But the primary issue is that once you close the I/O, you have to reopen the channel. So you have to move the invoke_shell call into the loop.
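One possible restructuring, sketched with a fresh channel per iteration (untested against real hardware; the command batch is copied from the question, and the function name is mine):

```python
COMMANDS = b'signalLevel\nnoiseLevel\nsignalMargin\nVSWR\nexit\n'

def query_once(client):
    """Open a fresh shell channel, send the command batch, and read
    until the remote side closes the channel (the trailing 'exit'
    should make it do so).  `client` is a connected paramiko.SSHClient,
    or anything with a compatible invoke_shell()."""
    chan = client.invoke_shell()
    stdin = chan.makefile_stdin('wb')
    stdout = chan.makefile('rb')
    stdin.write(COMMANDS)
    data = stdout.read()   # returns once the channel reaches EOF
    chan.close()
    return data

# The outer loop then just calls it repeatedly:
#   while True:
#       print(query_once(freewave_shell))
```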
I'm trying to create a function that connects to a machine from another machine via minicom. After connecting to minicom, Enter should be pressed in order to send commands to the connected machine. My Python code is as follows:
ssh = paramiko.SSHClient()
ssh.load_system_host_keys()
ssh.set_missing_host_key_policy(paramiko.AutoAddPolicy())
ssh.connect(self.serialHost, username=self.username, password=self.password)
shell = ssh.invoke_shell()
shell.send('minicom free -o')
shell.send('\u000d')
ssh.close()
Can someone tell me how can I send the enter key correctly?
Typically, when trying to execute commands in paramiko you don't have to invoke a shell and can just call ssh.exec_command(...). If the command you want to execute depends on the environment that starting a shell would give you, then you have to explicitly call the invoke_shell() method.
When you use invoke_shell() in paramiko, you have to send the line termination character(s) that the particular shell expects. If the machine you're ssh'ing into has bash as its default shell, you have to send a newline (i.e. '\n') character after each command. For example:
shell.send('ls\n')
instead of
shell.send('ls')
If you're connecting to an older Windows machine, you need to send both a carriage return and a newline (i.e. '\r\n') for the command to be processed.
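As a tiny illustration of the two cases (the helper name and the style labels are made up for this example):

```python
def terminated(command, style='unix'):
    """Append the line ending a remote shell expects before send():
    'unix' shells (bash etc.) want '\n', older Windows shells '\r\n'."""
    endings = {'unix': '\n', 'windows': '\r\n'}
    return command + endings[style]

# e.g. shell.send(terminated('ls'))             sends 'ls\n'
#      shell.send(terminated('dir', 'windows')) sends 'dir\r\n'
```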
You can try '\r\n' for the return key, but I think there is also an exec_command method call that does not require the return key. Something like:
shell.exec_command('minicom free -o')
I am trying to execute a remote command on an ec2 instance that needs sudo.
example code snippet
conn = boto.ec2.connect_to_region(....)
instance = conn.get_only_instances(instance_ids=instance_id)[0]
ssh_client = sshclient_from_instance(instance, ssh_key_file='path.to.pem', user_name='ec2-user')
chan = ssh_client.run_pty('sudo ls /root')
Using just ssh_client.run() returns a tuple that was easy to deal with, but it doesn't allow sudo. run_pty returns a paramiko.channel.Channel, and I can use recv() to get some output back, but I am not clear how to get the entire stdout.
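A draining loop along these lines is what I have been experimenting with; it relies only on the fact that recv() returns an empty byte string once the remote side closes the stream (the function name is my own):

```python
def read_channel(chan, bufsize=4096):
    """Read everything a paramiko.Channel sends until EOF.
    recv() returns b'' once the other end closes the stream."""
    chunks = []
    while True:
        data = chan.recv(bufsize)
        if not data:        # empty read means EOF
            break
        chunks.append(data)
    return b''.join(chunks)

# e.g.  output = read_channel(ssh_client.run_pty('sudo ls /root'))
```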
I'm a new paramiko user and am having difficulty running commands on a remote server with paramiko. I want to export a path and also run a program called tophat in the background. I can log in fine with paramiko.SSHClient(), but my exec_command calls produce no results.
stdin, stdout, sterr = ssh.exec_command('export PATH=$PATH:/proj/genome/programs/tophat-1.3.0/bin:/proj/genome/programs/cufflinks-1.0.3/bin:/proj/genome/programs/bowtie-0.12.7:/proj/genome/programs/samtools-0.1.16')
stdin, stdout, sterr = ssh.exec_command('nohup tophat -o /output/path/directory -I 10000 -p 8 --microexon-search -r 50 /proj/genome/programs/bowtie-0.12.7/indexes/ce9 /input/path/1 /input/path/2 &')
There is no nohup.out file, and Python just goes on to the next line with no error messages. I have tried without nohup as well, and the result is the same. I was trying to follow this paramiko tutorial.
am I using exec_command incorrectly?
I also ran into the same issue, and after looking at this article and this answer, I see the solution is to call the recv_exit_status() method of the Channel. Here is my code:
import paramiko
import time

cli = paramiko.client.SSHClient()
cli.set_missing_host_key_policy(paramiko.client.AutoAddPolicy())
cli.connect(hostname="10.66.171.100", username="mapping")
stdin_, stdout_, stderr_ = cli.exec_command("ls -l ~")
# time.sleep(2)  # Previously, I had to sleep for some time.
stdout_.channel.recv_exit_status()  # block until the remote command finishes
lines = stdout_.readlines()
for line in lines:
    print line
cli.close()
Now my code blocks until the remote command has finished. This method is explained here; please pay some attention to the warning.
exec_command() is non-blocking: it just sends the command to the server, and Python then runs the following code.
I think you should wait for the command execution to end and do the rest of the work after that.
time.sleep(10) could help (it requires import time).
Some examples show that you can read from the stdout ChannelFile object, or simply use stdout.readlines(); it reads all the response from the server, so this could help.
Your two exec_command lines above actually run in different exec sessions. I'm not sure whether this has an impact in your case.
I'd suggest you take a look at the demos in the demos folder; they use the Channel class, which has a better API for blocking / non-blocking sends for both shell and exec.
You'd better load the bash_profile before you run your command; otherwise you may get a 'command not found' error.
For example, I wrote the command command = 'mysqldump -uu -pp -h1.1.1.1 -P999 table > table.sql' for the purpose of dumping a MySQL table.
Then I had to load the bash_profile manually before that dump command, by prepending . ~/.profile; . ~/.bash_profile;.
Example
my_command = 'mysqldump -uu -pp -h1.1.1.1 -P999 table > table.sql;'
pre_command = """
. ~/.profile;
. ~/.bash_profile;
"""
command = pre_command + my_command
stdin, stdout, stderr = ssh.exec_command(command)