I'm working on a small Python program to speed up managing various Raspberry Pi servers over SSH. It's all done except for one thing: interacting with an SSH session isn't working the way I want it to.
I can interact with a command, but some commands (specifically apt full-upgrade) that ask, or may ask, questions while they're running aren't working. When it reaches the point where apt asks "Do you want to continue? [Y/n]" it falls over. I believe this is because apt can't read from stdin, so it aborts.
I know I could run the apt command with the -y flag and bypass the question, but ideally I'd like to be able to capture these prompts and ask the user for input. I've been using Paramiko to manage my SSH sessions. What I'm doing is capturing stdout and passing it to find() to look for strings like '[Y/n]'; if that's found, I redirect the user to an input prompt. That part works, but because there's no stdin available when apt asks the question, apt aborts, and when I then send my user's input back to the SSH session I get a socket closed error.
I've been looking for alternatives or ways around the issue, but apart from seeing Fabric mentioned as an alternative to Paramiko I can't see many other options out there. Does anyone know of any alternatives to Paramiko I can try? I don't think Fabric will work for me, given it's based on Paramiko, so I assume I'd hit the same error there. I'd appreciate any recommendations or pointers if there are other parts of Paramiko I can try (I've stuck to using exec_command). I have tried channels, which work up to a point, but I don't think keeping the SSH session open is the issue; I think I need some way to keep stdin open/accessible to the apt command on the remote machine so it doesn't abort.
At the minute, the best idea I've got is to run the command, let it potentially abort, look in stdout for the relevant phrases, then run the command again after giving the user a chance to set their inputs, and pass the whole lot to stdin.
EDIT:
My program in steps:
1. Log in to the remote host.
2. Issue a command.
3. Use .find() on the command string to check for the use of 'sudo'.
4. If sudo is present, additionally send the user's password to stdin along with the command.
5. Read stdout to check for keywords/phrases like '[Y/n]', which appear when the user is being asked for input while a command runs.
6. If a keyword is found, ask the user for their input, which can then be sent back to stdin to continue the command.
Steps 5 and 6 are where it fails, returning a socket closed error. From what I've read, I don't think the issue is with Paramiko as such, but with the command running on the remote host; in my case, sudo apt full-upgrade.
When I run that command it runs up to the 'Would you like to continue' point, then automatically aborts. I believe that's because there is nothing in stdin at that point (that's exactly what I'm asking the user for), so apt automatically aborts.
This is the part of my code where I'm running the commands:
admin = issue_cmd.find('sudo')
connect.connect(ip_addr, port, uname, passwd)
stdin, stdout, stderr = connect.exec_command(issue_cmd, get_pty=True)
if admin != -1:
    print('Sudo detected. Attempting to elevate privileges...')
    stdin.write(passwd + '\n')
    stdin.flush()
# stdout.read() blocks until the command exits and the channel closes,
# so by the time '[Y/n]' is found here the session is already gone and
# the stdin.write below raises "OSError: Socket is closed"
output = stdout.read()
search = str(output).find('[Y/n]')
if search != -1:
    resp = input(': ')
    print(resp)
    stdin.write(resp + '\n')
    stdin.flush()
print(output.decode('utf-8').strip('\n'))  # a second stdout.read() would return b''
print(stderr.read().decode('utf-8').strip('\n'))
connect.close()
and here's the error message I'm seeing:
OSError: Socket is closed
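For anyone who lands here with the same problem: one way to keep stdin open for the whole run is to skip exec_command and use invoke_shell(), which gives you a single PTY-backed channel that stays open until the shell exits, then poll that channel for output and scan it for prompts. A rough sketch, not tested against a real apt run (the prompt list and the polling interval are my assumptions):

```python
import time

# Prompt strings that suggest the remote command is waiting for input.
PROMPTS = ('[Y/n]', '[y/N]', 'password for')

def wants_input(buf):
    """True if the tail of the output buffer contains a known prompt."""
    return any(p in buf[-120:] for p in PROMPTS)

def run_interactive(client, command):
    """Run `command` through an already-connected paramiko.SSHClient,
    relaying interactive prompts to the local user."""
    chan = client.invoke_shell()       # PTY + shell: stdin stays open
    chan.send(command + '; exit\n')    # '; exit' closes the shell when done
    buf = ''
    while not chan.closed:
        if chan.recv_ready():
            chunk = chan.recv(4096).decode('utf-8', errors='replace')
            print(chunk, end='')
            buf += chunk
            if wants_input(buf):
                chan.send(input(': ') + '\n')
                buf = ''               # avoid matching the same prompt twice
        else:
            time.sleep(0.1)
```

Because the shell (not exec_command) owns the session, apt sees a live stdin and waits at '[Y/n]' instead of aborting.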
This question already has an answer here:
Executing command using Paramiko exec_command on device is not working
(1 answer)
Closed 2 years ago.
I'm working on a GPS position retrieval project. I have to connect over SSH to routers, then run commands to retrieve latitude and longitude.
I recently received new routers. When we connect to one of these routers, we receive an "OK" message confirming the connection is working; then we run the command we want and get the data as in the example below, always followed by the "OK" message indicating the command worked:
AT*GNSSSTATUS?
Location Fix=1
Number of satellites = 14
Latitude=+49.17081
Longitude=-123.06970
Date=2016/02/29
Time= 18:55:28
TTFF=9449 milliSeconds
OK
When I connect over SSH with the help of PuTTY, it works, but when my Python script sends the same command as mentioned above (AT*GNSSSTATUS?) through the Paramiko library, the result is just "OK", which only indicates that the connection is active. It's as if the command line opened by the script never receives the "ENTER" that should follow.
To test this, I tried a command that returns "ERROR" when I use PuTTY, but even in that case the Python script returns "OK".
To try to fix this, I tried different options by adding:
stdin, stdout, stderr = client.exec_command('AT*GNSSSTATUS? \r\n')
or
stdin, stdout, stderr = client.exec_command('AT*GNSSSTATUS? <CR>')
But in no case does this change the result.
My data list contains only the single string "OK".
The connection part to the router works fine.
Anyone have any ideas?
Thanks a lot!
Sorry if there are spelling mistakes ahah.
Thanks Martin Prikryl!
I looked at the link you sent me and it worked:
Executing command using Paramiko exec_command on device is not working.
So I changed my code to use a shell and send my commands through it.
Here is my code
shell = client.invoke_shell()
shell.send('AT*GNSSSTATUS? \r')
Thank you very much and have a nice day
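For completeness, the shell output still has to be read back after send(). A minimal sketch (the helper names are mine, and the timeout and OK/ERROR terminator handling may need tuning for a real router):

```python
import time

def parse_at_response(text):
    """Parse 'Key=Value' lines from an AT command response into a dict.
    Lines without '=' (including the trailing OK/ERROR status) are ignored."""
    fields = {}
    for line in text.splitlines():
        if '=' in line:
            key, _, value = line.partition('=')
            fields[key.strip()] = value.strip()
    return fields

def send_at_command(shell, command, timeout=5.0):
    """Send an AT command over a paramiko shell channel (from
    client.invoke_shell()) and collect output until the modem's
    OK/ERROR terminator appears or the timeout expires."""
    shell.send(command + '\r')
    buf = ''
    deadline = time.time() + timeout
    while time.time() < deadline:
        if shell.recv_ready():
            buf += shell.recv(4096).decode('utf-8', errors='replace')
            # crude terminator check; 'OK' could in theory appear mid-output
            if 'OK' in buf or 'ERROR' in buf:
                break
        else:
            time.sleep(0.1)
    return buf
```

With the sample output from the question, parse_at_response(...)['Latitude'] would give '+49.17081'.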
I'm writing a script that uses paramiko to SSH onto several remote hosts and run a few checks. Some hosts are set up as fail-overs for others, and I can't determine which is in use until I try to connect. Upon connecting to one of these 'inactive' hosts, the host informs me that I need to connect to another 'active' IP, then closes the connection after n seconds. This appears to be written to the stdout of the SSH connection/session (i.e. it is not an SSH banner).
I've used paramiko quite a bit, but I'm at a loss as to how to get this output from the connection. exec_command will obviously give me stdout and stderr, but the host outputs this message immediately upon connection, doesn't accept any other incoming requests/messages, and just closes after n seconds.
I don't want to wait for the timeout before moving on to the next host, and I'd also like to verify that this is the reason I couldn't connect and run the checks; otherwise my script works as intended.
Any suggestions as to how I can capture this output, with or without paramiko, is greatly appreciated.
I figured out a way to get the data. It was pretty straightforward to be honest, albeit a little hackish. This might not work in other cases, especially if there is latency, but I could also be misunderstanding what's happening:
When the connection opens, the server spits out two messages: one saying it can't chdir to a particular directory, then a few milliseconds later another stating that you need to connect to the other IP. If I send a command immediately after connecting (it doesn't matter what command), exec_command will interpret this second message as the response. So for now I have a solution, as I can check the returned string for a known message and change the flow of execution.
However, if what I describe is accurate, this may not work in situations where there is too much latency and the 'test' command isn't sent before the server's response has been received.
As far as I can tell (and I may be very wrong), there is currently no proper way to get the stdout stream immediately after opening the connection with paramiko. If someone knows a way, please let me know.
(Please, if you have an answer, post a tested script; I've tried plenty of theoretically perfect scripts that didn't work.)
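For anyone hitting the same thing: instead of racing exec_command against the server's message, you can open a shell channel right after connecting and poll it for a short window; whatever the server prints on connect shows up there. A sketch, with the window length and the marker string as assumptions (match whatever your hosts actually print):

```python
import time

def read_initial_output(chan, window=2.0):
    """Collect whatever the server writes to the channel during the first
    `window` seconds after connecting. `chan` is a paramiko Channel,
    e.g. from client.invoke_shell()."""
    buf = ''
    deadline = time.time() + window
    while time.time() < deadline and not chan.closed:
        if chan.recv_ready():
            buf += chan.recv(4096).decode('utf-8', errors='replace')
        else:
            time.sleep(0.05)
    return buf

def is_failover_notice(text, marker='connect to'):
    """True if the greeting looks like the 'use the other IP' message.
    The marker string is a placeholder for the real message."""
    return marker in text.lower()
```

In the calling code you would do something like: greeting = read_initial_output(client.invoke_shell()), then skip the host early if is_failover_notice(greeting) is true, rather than waiting for the n-second disconnect.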
I have a cron job in Python that connects to an external server through an SSH tunnel to read and write a MySQL database there.
The Python program opens an SSH tunnel and binds the MySQL port to a local one, then connects to the external DB through the pymysql library as if it were local.
Everything works fine, but... I couldn't make the tunnel be opened and closed by the same program. So I have to open a tunnel manually and leave it open with the -f -N -T parameters to ssh.
It works OK, but I'd like the program to manage the tunnel itself, without needing to leave a tunnel open. I couldn't find the right parameters for this.
I've read some answers online that involve waiting for the tunnel to open and then shutting it down by killing the process in the OS. I don't like those solutions. In fact, I could do this in the past with these commands:
sys_command = "ssh -T -f user@external.server -i /home/user/.ssh/id_rsa -p 22222 -L 3310:localhost:3306"
prog = subprocess.Popen(sys_command, stdout=subprocess.PIPE, stderr=subprocess.PIPE, shell=True)
out, err = prog.communicate()
and then I connected to the db with:
con_test = pymysql.connect(user=conf.dbUser, passwd=conf.dbPass, host='localhost', port = 3310, db=conf.database)
It worked in another environment, but when I try it now I get:
Cannot fork into background without a command to execute.
I've tried with a lot of combinations of parameters. The only one that worked was:
ssh -f -N -T .....
But, as I said, this opens a tunnel indefinitely.
I've also tried using Popen as it should be used (separating the parameters with quotes and commas, and with shell=False):
sys_command = ["ssh", "-f", "-T", "user@external.server", "-i", "/home/user/.ssh/id_rsa", "-p", "22222", "-L", "3310:localhost:3306"]
prog = subprocess.Popen(sys_command, stdout=subprocess.PIPE, stderr=subprocess.PIPE, shell=False)
but i get the same error.
But, oh surprise, if I use a trick that I read somewhere and add "sleep", "10" to the end:
sys_command = ["ssh", "-f", "-T", "user@external.server", "-i", "/home/user/.ssh/id_rsa", "-p", "22222", "-L", "3310:localhost:3306", "sleep", "10"]
...I get the greeting from the other server through out and err!
and, below, the error:
bind: Cannot assign requested address
So, if someone knows a correct way to do it, please answer with a tested script, because many solutions I found in internet just didn't work in my environment.
Thanks in advance.
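In case it helps, here is the shape of a solution that keeps the tunnel inside the program: drop -f entirely (that flag is what triggers "Cannot fork into background without a command to execute" when there is no remote command), keep ssh in the foreground as a child process with -N -T, wait until the local port accepts connections, and terminate() the child when the job is done. The host, key path, and ports below are taken from the question; the helper names are mine:

```python
import socket
import subprocess
import time

def build_tunnel_cmd(user_host, key, port, local_port, remote_port):
    # -N: no remote command, -T: no TTY; no -f, so ssh stays our child process
    return ['ssh', '-N', '-T', user_host,
            '-i', key, '-p', str(port),
            '-L', f'{local_port}:localhost:{remote_port}']

def wait_for_port(port, host='127.0.0.1', timeout=10.0):
    """Poll until the forwarded local port accepts connections."""
    deadline = time.time() + timeout
    while time.time() < deadline:
        try:
            with socket.create_connection((host, port), timeout=1):
                return True
        except OSError:
            time.sleep(0.2)
    return False

def main():
    cmd = build_tunnel_cmd('user@external.server',
                           '/home/user/.ssh/id_rsa', 22222, 3310, 3306)
    tunnel = subprocess.Popen(cmd, stdout=subprocess.PIPE,
                              stderr=subprocess.PIPE)
    try:
        if not wait_for_port(3310):
            raise RuntimeError('tunnel did not come up')
        # ... pymysql.connect(host='localhost', port=3310, ...) goes here ...
    finally:
        tunnel.terminate()   # closes the tunnel when the job is done
        tunnel.wait()
```

Because ssh remains a direct child, the tunnel lives and dies with the program; no leftover background process and no killing by PID.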
Is it possible to telnet to a server and, from there, telnet to another server in Python?
There is a controller that I telnet into using a username and password, and from the controller's command line I need to log in as root to run Linux commands. How would I do that using Python?
I use telnetlib to log in to the router controller, but from the router controller I need to log in again to get a shell. Is this possible using Python?
Thanks!
Just checked this with the hardware I have on hand and telnetlib. Saw no problem.
When you are connected to the first device, just send all the necessary commands using telnet.write('cmd'). That may be sudo su\n, telnet 192.168.0.2\n, or whatever else. telnetlib only keeps track of its own telnet connection; all secondary connections are handled by the corresponding remote hosts.
Have you looked into using expect (there should be a Python binding)? Basically, what I think you want to do is:
From your python script, use telnetlib to connect to server A (pass in username/password).
Within this "socket", send the remaining commands, e.g. "telnet serverB", and use expect (or some other mechanism) to check that you get back the expected "User:" prompt; if so, send the user, then the password, then whatever commands; otherwise, handle errors.
This should be very much doable and is fairly common with older stuff that doesn't support a cleaner API.
You can use write() to issue the sudo command (note that in Python 3, telnetlib works with bytes):
tn.write(b"sudo\n")
You could also use read_until() to help with the credentials:
tn.read_until(b"login: ")
tn.write(user.encode('ascii') + b"\n")
if password:
    tn.read_until(b"Password: ")
    tn.write(password.encode('ascii') + b"\n")
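Putting those pieces together, a nested login might look like the sketch below. It assumes telnetlib (deprecated since Python 3.11 and removed in 3.13), and the prompt strings and host are placeholders to match against whatever your devices actually print:

```python
def cmd(text):
    """telnetlib works in bytes; encode a command and append the newline."""
    return text.encode('ascii') + b'\n'

def nested_login(tn, inner_host, user, password):
    """On an already-open telnetlib.Telnet session `tn` (connected to the
    controller), telnet onward to a second host and log in there.
    Prompt strings below are placeholders."""
    tn.write(cmd('telnet ' + inner_host))
    tn.read_until(b'login: ', timeout=10)
    tn.write(cmd(user))
    tn.read_until(b'Password: ', timeout=10)
    tn.write(cmd(password))
    # From here on, writes go to the inner host's shell.
    tn.write(cmd('whoami'))
```

The key point from the answers above: there is only one telnet connection from Python's point of view; the inner session is just more bytes flowing through it.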
How do I retrieve the netstat -a output from a local server in my Python script? I have tried subprocess.Popen(['ssh','server','pass','netstat','-a'], stdout=file1) but it does not work. Any advice?
You must have key-based authorisation set up on the server where netstat is supposed to run.
In the absence of key-based authorisation, the ssh command will return a prompt asking for a password.
Below is the link for key based authorisation:
http://wp.uberdose.com/2006/10/16/ssh-automatic-login/
or
http://linuxproblem.org/art_9.html
Once the keys are exchanged, the command no longer requires a password and no prompt will occur when you run it. Note that the stray 'pass' argument has also been dropped; ssh treats everything after the host as the remote command, so 'pass' was being run on the server instead of netstat:
subprocess.Popen(['ssh','server','netstat','-a'], stdout=file1)