Problem
I've got a program using Pexpect to send commands over SSH. Its purpose is to SSH into a Panduit PDU (SmartZone G5) and turn off outlet 1.
I can SSH into the PDU manually, run dev outlet 1 1 off, and the outlet turns off, but the same command does nothing when sent from my program. This puzzles me, since the method should be no different from logging in via SSH with iTerm and typing the command myself. Is this a me problem, or just how the PDU behaves?
What I've Tried
I've tested Pexpect thoroughly on a Linux server, using mkdir and touch to create directories and files, so I know it works there without issue. But when I use it on the Panduit PDU, the PDU ignores the commands.
I use time.sleep() to give the PDU time to catch up, as it is a slow system and seems to queue multiple commands.
It's not a module issue, an interpreter issue, or anything silly like that; the program itself runs fine. (I'm sure it isn't up to standard or particularly safe, but that hardly matters if I can't get it working in the first place.)
Why Pexpect?
I must use Pexpect as Paramiko and Fabric don't work for this particular connection.
I must specify ssh -F /dev/null admin@ipaddress, otherwise the PDU rejects the SSH session, and I cannot get pxssh, Fabric, or Paramiko to do this, which is why I don't use them.
I'm aware of options like .set_missing_host_key_policy(paramiko.AutoAddPolicy()) for Paramiko and s = pxssh.pxssh(options={"StrictHostKeyChecking": "no", "UserKnownHostsFile": "/dev/null"}), and they do not work. Hence Pexpect.
Code
import pexpect
import time
import sys

server_ip = "192.168.0.1"
server_user = "admin"
server_pass = "password"

child = pexpect.spawn('bash')
child.logfile_read = sys.stdout.buffer
child.expect('')
# This is the only way to access the PDU; Paramiko won't work either.
child.sendline('ssh -F /dev/null %s@%s -oStrictHostKeyChecking=no' % (server_user, server_ip))
child.expect("admin@192.168.0.1's password:")
child.sendline(server_pass)
child.expect('PANDUIT>')
time.sleep(3)  # Wait for the PDU to catch up or it won't recognise the command
child.sendline('dev outlet 1 1 off')  # Power off outlet 1
print('\nfinished')  # \n is there otherwise it gets sucked into the stdout
For reference, this is what happens when I connect manually with iTerm and enter the commands; it works perfectly.
I've tried child.expect('SUCCESS'), but it is never seen. The only thing Pexpect seems able to find in the PDU's output is child.expect('\n'), which I think may come from a buffer or something.
~/Documents ❯ ssh -F /dev/null admin@192.168.0.1
admin@192.168.0.1's password:
Type ? for command listing
sys PDU system configure and setting
net PDU net application configure and setting
usr PDU user operation
dev PDU device setting
pwr PDU power setting
PANDUIT>dev outlet 1 1 off
SUCCESS
And this is what my code outputs:
~/Documents❯ python3 code.py
ssh -F /dev/null admin@192.168.0.1 -oStrictHostKeyChecking=no
The default interactive shell is now zsh.
To update your account to use zsh, please run `chsh -s /bin/zsh`.
For more details, please visit https://support.apple.com/kb/HT208050.
bash-3.2$ ssh -F /dev/null admin@192.168.0.1 -oStrictHostKeyChecking=no
admin@192.168.0.1's password:
Type ? for command listing
sys PDU system configure and setting
net PDU net application configure and setting
usr PDU user operation
dev PDU device setting
pwr PDU power setting
PANDUIT>
finished
Related
I am writing a command-line app in Python using the click module that will SSH to a Cisco network device and send configuration commands over the connection using the netmiko module. The problem I'm running into is that SSHing to the network device requires a hostname/IP, username, and password. I am trying to implement a way for a user of my script to log in to a device once and keep the SSH connection open, allowing subcommands to be run without logging in each time. For example,
$ myapp ssh
hostname/IP: 10.10.110.10
username: user
password: ********
Connected to device 10.10.101.10
$ myapp command1
log output
$ myapp --option command2
log output
$ myapp disconnect
closing connection to 10.10.101.10
How would I go about storing/handling credentials to allow this functionality in my CLI? I have seen recommendations of caching or OAuth while researching this issue, but I'm still not sure how to implement them, or what the recommended and safe way to do this is.
Perhaps you are attempting something like this.
$ myapp ssh -u user -p password
(myapp) command1
(myapp) command2
(myapp) disconnect
$
Python has a standard library module cmd that may help:
https://docs.python.org/3.5/library/cmd.html
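A minimal sketch of that idea using cmd. Everything here is illustrative: MyAppShell and its fake connection handling are stand-ins, and a real version would store a netmiko/paramiko session object instead of a string.

```python
import cmd

class MyAppShell(cmd.Cmd):
    """Interactive shell that keeps one 'connection' open across subcommands."""
    intro = "Type help or ? to list commands."
    prompt = "(myapp) "

    def __init__(self):
        super().__init__()
        self.connection = None  # would hold a live SSH session in a real app

    def do_ssh(self, arg):
        """ssh <host> -- pretend to open a connection and keep it around."""
        self.connection = arg
        print("Connected to device %s" % self.connection)

    def do_command1(self, arg):
        """Run a command over the stored connection."""
        if not self.connection:
            print("Not connected")
        else:
            print("log output from %s" % self.connection)

    def do_disconnect(self, arg):
        """Close the stored connection and exit the shell."""
        if self.connection:
            print("closing connection to %s" % self.connection)
        self.connection = None
        return True  # returning True ends cmd's loop

# In a real CLI you'd call MyAppShell().cmdloop(); here we drive it directly:
shell = MyAppShell()
shell.onecmd("ssh 10.10.110.10")   # prints: Connected to device 10.10.110.10
shell.onecmd("command1")
shell.onecmd("disconnect")
```

Because cmdloop() keeps the process alive between commands, the session object survives across subcommands, which sidesteps re-authenticating on every invocation.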
I have two Raspberry Pis. I am trying to transfer files from one Pi to the other using scp. I am trying to do this through Python because the program that will be transferring files is a Python file.
Below is the shell script I have for the scp part (I blurred out the password and IP):
#!/bin/sh
sshpass -p ######## scp test.txt pi@IP:/home/pi
and below is the Python Script that launches that Shell script.
import subprocess
subprocess.call(['./ssh.sh'])
print("DONE")
For some reason the Python script doesn't kick back any errors and hits the print line, but the file is not transferred. When I run the scp command outside of Python, the file transfers just fine. Am I doing something incorrect here?
****EDIT****
I can't even get subprocess to work with this, which is why I ended up using a shell script. Here is my attempt with subprocess:
import subprocess
subprocess.call("sshpass -p ######## scp test.txt pi@IP:/home/pi")
print("DONE")
Again I get no errors, but the file is not transferred
****EDIT #2****
So I found out that because sshpass is being used, scp isn't prompting me to add the IP to known hosts; as a result the file simply isn't transferred at all. I need a way to add this acceptance into the script, i.e. I get the following if I launch the command without sshpass:
The authenticity of host 'IP (IP)' can't be established.
ECDSA key fingerprint is 13:91:24:8e:6f:21:98:1f:5b:3a:c8:42:7a:88:e9:91.
Are you sure you want to continue connecting (yes/no)?
I want to pass "yes\n" to this prompt (e.g. via communicate()), as well as the password afterwards. Is this possible?
For the first query
You can use subprocess.Popen to get the output (STDOUT) and error (STDERR) of the executed command.
import subprocess

cmd = 'sshpass -p ****** scp dinesh.txt root@256.219.210.135:/root'
p = subprocess.Popen(cmd.split(), stdout=subprocess.PIPE, stderr=subprocess.PIPE)
out, err = p.communicate()
print("Output is ", out)
print("Error is ", err)
If you execute the above code with the wrong password, you will get output like this:
[root@centos /]# python code.py
Output is
Error is Permission denied, please try again.
In this case, if the file is successfully transferred, there is no output.
If you execute a command like ls -l, its output will be printed.
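The same Popen/communicate pattern can be seen with a harmless local command, no sshpass or remote host needed; this just illustrates where stdout and stderr end up:

```python
import subprocess

# Run a local command the same way: stdout and stderr are captured
# separately and returned by communicate() as bytes.
p = subprocess.Popen(["ls", "-l", "/"],
                     stdout=subprocess.PIPE, stderr=subprocess.PIPE)
out, err = p.communicate()
print("Output is", out.decode())
print("Error is", err.decode())
```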
For your second query (****EDIT #2****)
Options are:
Passwordless SSH (key-based login). Check this.
Pexpect
I found a much easier way of tackling all of this:
sshpass -p ###### scp -o StrictHostKeyChecking=no test.txt pi@IP:/home/pi
The -o StrictHostKeyChecking=no switch automatically accepts the host key and stores it in known_hosts, so I don't need to interact with the prompt at all. The interaction from Python to shell works with that addition; doing this solely through subprocess also works.
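A sketch of that subprocess-only route. build_scp_cmd is a hypothetical helper, the password, host, and paths are placeholders, and sshpass must be installed for the actual call to work:

```python
import subprocess

def build_scp_cmd(password, src, user, host, dest):
    # List form avoids shell quoting issues; -o StrictHostKeyChecking=no
    # auto-accepts the host key so there is no yes/no prompt to answer.
    return [
        "sshpass", "-p", password,
        "scp", "-o", "StrictHostKeyChecking=no",
        src, "%s@%s:%s" % (user, host, dest),
    ]

cmd = build_scp_cmd("########", "test.txt", "pi", "192.168.1.20", "/home/pi")
print(" ".join(cmd))
# To actually transfer the file: subprocess.call(cmd)
```

Passing the argument list directly to subprocess.call (rather than a single string) is what makes shell=True unnecessary here.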
If you don't mind trying other approaches, it's worth using SCPClient from the scp module.
I'm trying to run netsh command on remote windows hosts (windows domain environment with admin rights). The following code works fine on local host but I would like to run it on remote hosts as well using python.
import subprocess
netshcmd = subprocess.Popen('netsh advfirewall show rule name="all"', shell=True, stderr=subprocess.PIPE, stdout=subprocess.PIPE)
output, errors = netshcmd.communicate()
The problem is that I'm not sure what method to use to initiate the connection to the remote hosts and then run the subprocess commands there. I cannot use SSH or PsTools, and would like to try to implement it using the existing pywin32 modules if possible.
I have used the WMI module in the past, which makes it very easy to query a remote host, but I couldn't find any way to query firewall policies over WMI, which is why I'm using subprocess.
First, log in to the remote host machine using the pxssh/pexpect modules (see: Python: How can remote from my local pc to remoteA to remoteb to remote c using Paramiko).
Remote login to Windows:
import pexpect

child = pexpect.spawn('ssh tiger@172.16.0.190 -p 8888')
child.logfile = open("/tmp/mylog", "w")
child.expect(r'.*Are you sure you want to continue connecting \(yes/no\)\?')
child.sendline("yes")
child.expect(".*assword:")
child.sendline("tiger\r")
child.expect('Press any key to continue...')
child.send('\r')
child.expect(r'C:\\Users\\.*>')
child.sendline('dir')
child.expect(r'C:\\Users\\.*>')  # plain pexpect.spawn has no prompt(); expect the prompt pattern again
print(child.before)
(Related: Python - Pxssh - Getting a password refused error when trying to login to a remote server)
Then send your netsh command.
I recommend using Fabric. It's a powerful Python tool with a suite of operations for executing local or remote shell commands, as well as auxiliary functionality such as prompting the running user for input or aborting execution.
Install Fabric: pip install fabric
Write the following script, named remote_cmd.py:
"""
Usage:
python remote_cmd.py ip_address username password your_command
"""
from sys import argv
from fabric.api import run, env
def set_host_config(ip, user, password):
env.host_string = ip
env.user = user
env.password = password
def cmd(your_command):
"""
executes command remotely
"""
output = run(your_command)
return output
def main():
set_host_config(argv[1], argv[2], argv[3])
cmd(argv[4]))
if __name__ == '__main__':
main()
Usage:
python remote_cmd.py ip_address username password command
When trying to traverse a SOCKS5 proxy to an RHEL5 Linux host using Fabric 1.6, the command returns but no output appears on stdout.
$> fab -H myhost -f ./fabfile.py remote_test --show=debug
Using fabfile '/home/myuser/fabric/fabfile.py'
Commands to run: remote_test
Parallel tasks now using pool size of 1
[myhost] Executing task 'remote_test'
[myhost] run: echo testing
Enter SOCKS5 password for myuser:
[myhost] Login password for 'myuser':
$> echo $?
0
$>
The remote_test function is:
def remote_test():
    run('echo testing')
If I run the command against a non SOCKS5 host it works fine.
I am running the latest builds, although to date I haven't gotten this to work:
Python 2.7.3
Paramiko == 1.10.0
pycrypto == 2.6
fabric == 1.6.0
RHEL5.9
openssh-4.3p2-82.el5
My ~/.ssh/config looks like the following:
Host *.domain
    ProxyCommand connect -S socksproxy.domain:1080 %h %p
I'm using the connect binary built from http://www.meadowy.org/~gotoh/ssh/connect.c
I don't have access to GitHub from the company network, so I will ask there when I get a chance as well.
Has anyone got any ideas why this could be occurring?
Thanks
Matt
I use connect rather than Fabric, but the answer is surely the same. There is an explanation in connect.c that the SOCKS5_PASSWORD, HTTP_PROXY_PASSWORD, and CONNECT_PASSWORD environment variables do what you want. I have a script called ssh-tbb that goes as follows.
#!/bin/bash
export CONNECT_PASSWORD=""
exec ssh -o ProxyCommand="connect -5 -S 127.0.0.1:9150 %h %p" $*
Ideally, one should call this script ssh-tor and detect whether tor lives on port 9050 or 9150, of course.
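That detection could be sketched like this; it is a guess at the idea, not part of the original script. It assumes 9150 is Tor Browser's SOCKS port and 9050 the system daemon's, which are their usual defaults:

```python
import socket

def tor_socks_port():
    # Prefer Tor Browser's port, fall back to the system daemon's;
    # connect_ex returns 0 only if something is listening there.
    for port in (9150, 9050):
        with socket.socket() as s:
            s.settimeout(0.2)
            if s.connect_ex(("127.0.0.1", port)) == 0:
                return port
    return None

print(tor_socks_port())
```

The shell wrapper would then substitute the detected port into the connect -S 127.0.0.1:PORT proxy command (or bail out if neither port answers).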
I am writing a GUI which uses SSH commands. I tried to use the subprocess module to call ssh and set the SSH_ASKPASS environment variable so that my application can pop up a window asking for the SSH password. However, I cannot make ssh read the password using the given SSH_ASKPASS command: it always prompts in the terminal window, regardless of how I set the DISPLAY, SSH_ASKPASS, and TERM environment variables or how I pipe standard input/output. How can I make sure that ssh is detached from the current TTY and uses the given program to read the password?
My test code was:
#!/usr/bin/env python
import os
import subprocess
env = dict(os.environ)
env['DISPLAY'] = ':9999' # Fake value (trying in OS X and Windows)
del env['TERM']
env['SSH_ASKPASS'] = '/opt/local/libexec/git-core/git-gui--askpass'
p = subprocess.Popen(['ssh', '-T', '-v', 'user@myhost.com'],
stdin=subprocess.PIPE,
stdout=subprocess.PIPE,
stderr=subprocess.PIPE,
env=env
)
p.communicate()
SSH uses the SSH_ASKPASS variable only if the process is really detached from the TTY (redirecting stdin and setting environment variables is not enough). To detach a process from its console, it should fork and call os.setsid(). So the first solution I found was:
# Detach process
pid = os.fork()
if pid == 0:
    # Ensure that the child process is detached from the TTY
    os.setsid()
    # ... call ssh from here, then exit the child ...
    os._exit(0)
else:
    print("Waiting for ssh (pid %d)" % pid)
    os.waitpid(pid, 0)
    print("Done")
There is also a more elegant way to do this using the subprocess module: via the preexec_fn argument we can pass a Python function that is called in the subprocess just before executing the external command. So the solution to the question is one extra line:
env = {'SSH_ASKPASS':'/path/to/myprog', 'DISPLAY':':9999'}
p = subprocess.Popen(['ssh', '-T', '-v', 'user@myhost.com'],
stdin=subprocess.PIPE,
stdout=subprocess.PIPE,
stderr=subprocess.PIPE,
env=env,
preexec_fn=os.setsid
)
Your problem is that SSH detects your TTY and talks to it directly (as is clearly stated in the man page). You can try running ssh without a terminal; the man page suggests it might be necessary to redirect stdin from /dev/null for ssh to think it has no terminal.
You can also use pexpect for this; it's known to work with SSH - example usage.
The Right Way (TM) to do what you're trying to do is either:
Use a library specifically for using SSH in python (for example twisted conch or paramiko)
Use public and private keys so that passwords will not be necessary
If you want a quick and dirty way of doing it for your own personal usage, you could enable passwordless login between these two machines by doing this in your terminal:
ssh-keygen -t rsa # generate a keypair (if you haven't done this already)
ssh-copy-id user#other_machine # copy your public key to the other machine
Then you can get ssh commands to go through (subprocess doesn't seem to accept a full ssh command string directly) by creating a script (remember to mark it executable, e.g. chmod 755 my_script.sh) with the things you want, such as:
#!/bin/bash
ssh user#other_machine ls
and call it from your program:
import subprocess
response = subprocess.call("./my_script.sh")
print(response)
For production use in apps that need to be deployed on other people's machines, I'd go with abyx's approach of using an SSH library. Much simpler than messing with environment variables.