Executing python pyautogui script on ssh remote executes commands on ssh host - python

I am trying to execute my Python script over SSH on my Pi. I have set up my private/public keys and launch it with a Python script on my host machine:
subprocess.run('ssh -p 2222 john@<IP_of_Pi> python3 /home/john/test.py', shell=True)
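A minimal sketch of the same call in list form, with the remote script's output captured so errors from the Pi show up locally (check=True and the output capture are assumptions added here for illustration, not part of the original setup):
import subprocess

# Run test.py on the Pi over SSH (same user, port and path as above) and
# surface its stdout/stderr locally; check=True raises if the remote run fails.
result = subprocess.run(
    ["ssh", "-p", "2222", "john@<IP_of_Pi>", "python3", "/home/john/test.py"],
    capture_output=True, text=True, check=True,
)
print(result.stdout)
print(result.stderr)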
I am trying to automate some things using pyautogui. My basic script looks like this:
import pyautogui as aut

aut.PAUSE = 1
# Open a terminal with Ctrl+Alt+T
aut.keyDown('ctrl')
aut.keyDown('alt')
aut.press('t')
aut.keyUp('alt')
aut.keyUp('ctrl')
# Launch Firefox from the terminal
aut.typewrite(['f','i','r','e','f','o','x','enter'])
# Focus the address bar (Alt+D) and open duckduckgo.com
aut.hotkey('alt','d')
aut.typewrite(['d','u','c','k','d','u','c','k','g','o','.','c','o','m','enter'])
My problem is that all the commands end up running on my local host machine. The same happens if I use other automation frameworks, e.g. Selenium: every browser I open is started not on my Pi, but on my host machine.
I have tried several variations of the subprocess call, like:
subprocess.run(['ssh', '-p', '2222', 'john@<IP_of_Pi>', 'python3', '/home/john/test.py'])
subprocess.run(['ssh', '-p', '2222', 'john@<IP_of_Pi>', 'python3', '/home/john/test.py'], stdout=subprocess.PIPE)
subprocess.Popen(['ssh', '-p', '2222', 'john@<IP_of_Pi>', 'python3', '/home/john/test.py'])
...
I can reproduce the behaviour if I execute test.py from my shell: the browser still opens on my host.
Just in case it's relevant, here is the ssh_config on my remote (Pi):
Host *
# ForwardAgent no
# ForwardX11 no
# ForwardX11Trusted yes
# PasswordAuthentication yes
# HostbasedAuthentication no
# GSSAPIAuthentication no
# GSSAPIDelegateCredentials no
# GSSAPIKeyExchange no
# GSSAPITrustDNS no
# BatchMode no
# CheckHostIP yes
# AddressFamily any
# ConnectTimeout 0
# StrictHostKeyChecking ask
# IdentityFile ~/.ssh/id_rsa
# IdentityFile ~/.ssh/id_dsa
# IdentityFile ~/.ssh/id_ecdsa
# IdentityFile ~/.ssh/id_ed25519
# Port 22
# Protocol 2
# Ciphers aes128-ctr,aes192-ctr,aes256-ctr,aes128-cbc,3des-cbc
# MACs hmac-md5,hmac-sha1,umac-64@openssh.com
# EscapeChar ~
# Tunnel no
# TunnelDevice any:any
# PermitLocalCommand no
# VisualHostKey no
# ProxyCommand ssh -q -W %h:%p gateway.example.com
# RekeyLimit 1G 1h
SendEnv LANG LC_*
HashKnownHosts yes
GSSAPIAuthentication yes

Related

SSH tunnel with terminal

I'm a beginner at SSH, so be kind with my limited knowledge ;)
What I want to do is as follows:
SSH to a PC and then, from this PC, SSH to another one; see the picture below:
SSH Tunnel
Here are the commands I run when I do it manually:
ssh user@155.254.0.1
After this command I am prompted to enter the password.
From here I SSH again to the next "PC" with the following command:
ssh root@190.22.0.1 -y
and then I am prompted to enter the password again.
I tried to use a Python script to do it automatically, but I was not able to get past the second step.
Here is what the Python code looks like:
import subprocess

cmd_1 = ["ls"]
cmd_3 = ['ls', '-l']

def send_top_cmd():
    cmd_2 = ['top', "-b", "-n", "5"]
    com2 = subprocess.Popen(cmd_2, stdout=out)
    com2.wait()

def send_ssh_pc_1():
    cmd = ["sshpass", "-p", "'user'", "ssh", "swupdate@155.254.0.1"]
    ssh_sga = subprocess.Popen(cmd, stdout=out)
    ssh_sga.wait()

def send_ssh_pc_2():
    cmd = ["sshpass", "-p", "'root'", "ssh", "root@190.22.0.1"]
    ssh_hpa = subprocess.Popen(cmd, stdout=out)
    ssh_hpa.wait()

def send_exit():
    cmd = ["exit"]
    process = subprocess.Popen(cmd, stdout=out)
    cmd = ["exit"]
    process = subprocess.Popen(cmd, stdout=out)
    print("done")

with open('output.txt', 'w') as out:
    send_ssh_pc_1()  # ssh PC 1
    send_ssh_pc_2()  # ssh PC 2
    send_top_cmd()   # Send a simple command
    send_exit()
The script fails at send_ssh_pc_2() since I don't have sshpass installed and there's no possibility to install it there :(
Is there an easier way to do it automatically?
So much easier to write this as an answer instead of a comment.
First, enable public-key (RSA) authentication on both of your SSH boxes. Then you don't need to worry about passing passwords. https://www.ssh.com/academy/ssh/public-key-authentication
Then open an SSH tunnel from your computer with the following command:
ssh -L 2222:190.22.0.1:22 user@155.254.0.1
That opens a tunnel from port 2222 on your local computer to port 22 on the host at 190.22.0.1. Next you can open an SSH connection to the target computer like this:
ssh -p 2222 root@localhost
If your RSA private key is authorized on both user@155.254.0.1 and root@190.22.0.1, no passwords should be asked for and you have an SSH connection to 190.22.0.1 from your workstation.
Of course you can tunnel any TCP traffic, not just SSH.
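To tie this back to the question's subprocess approach, here is a rough sketch that scripts both steps (it assumes key-based auth is already set up, reuses the addresses, port, top command and output.txt from above, and uses a crude sleep instead of properly waiting for the tunnel):
import subprocess
import time

# Sketch only: open the tunnel in the background (first command above)...
tunnel = subprocess.Popen(
    ["ssh", "-N", "-L", "2222:190.22.0.1:22", "user@155.254.0.1"]
)
time.sleep(3)  # crude placeholder for "wait until the tunnel is up"

# ...then run the command on the second PC through the forwarded port.
with open("output.txt", "w") as out:
    subprocess.run(
        ["ssh", "-p", "2222", "root@localhost", "top", "-b", "-n", "5"],
        stdout=out,
    )

tunnel.terminate()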
*** ADDED ***
Here is an example of the contents of an authorized_keys file (some content removed).
ssh-rsa AAAAB3NzaC1yc2EAAAABIwAAAQEA3fauf5H3kN92Gxx8xerCF***********************************************************************************************************************PPIrUMdf1948pqLspom8SIyeqJeKX8wVqcJch35O0Q4UVlbw== user#host
ssh-rsa AAAAB3Nzaasdfrgaa4634w4gfdewrtfauf5H3kN92Gxx8xerCF***********************************************************************************************************************PPIrUMdf1948pqLspossdfgqrbbsrdtwetdsfgsfdgsd== admin#anotherhost

Issue with capturing Password: prompt on remote device

Fabric not seeing the Password: prompt for remote device (Aruba Mobility Master) over SSH.
The script I wrote uses fabric2 and Python 3 to log in to a remote network device and run a command that copies a file from this device to another device via SCP. After the SCP command runs, the device asks for a password. This prompt is visible when running over a normal SSH client, but not when running with Fabric.
I have tested with pty=False and pty=True.
The script doesn't hang as if it were waiting for input either; it just continues, and the SCP fails with an incorrect password.
The password prompt is a bit special in that it echoes back characters as stars (*) instead of not echoing anything at all.
The network device does not provide a normal bash shell; instead it is a vendor-specific shell (Aruba/HPE). The device is an "Aruba Mobility Master".
from fabric import Connection
from invoke import Responder

scppass = Responder(
    pattern=r'Password:',
    response='MyPassword\n',
)

connect_kwargs = {"password": "LoginPassword"}
c = Connection(host="1.2.3.4", user="username", connect_kwargs=connect_kwargs)

# Have tried with pty=False and pty=True
c.run("copy flash: configbackup.tar.gz scp: 2.3.4.5 username /PATH/configbackup.tar.gz", pty=True, watchers=[scppass])
This is how it looks when run from an interactive SSH session
Password:*********************
Secure file copy:
Press 'q' to abort.
....
File uploaded successfully
This is the output from fabric
Secure file copy:
Press 'q' to abort.
............
Error copying file:
Permission denied: wrong username or password
I also tried:
ssh username@1.2.3.4 "copy flash: nothng scp: 2.3.4.5 user /something/asd" >out 2>err
ssh -t username@1.2.3.4 "copy flash: nothng scp: 2.3.4.5 user /something/asd" >out 2>err
Neither of these captures the "Password:" prompt in either the stdout or stderr file.
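On the Fabric side, a comparable debugging step (a sketch using the same connection and command as above; warn=True and hide=False are assumptions added so the failed run doesn't raise and everything Fabric captured gets printed):
# Sketch: keep the run from raising on failure and dump whatever Fabric saw,
# to check whether the "Password:" text ever reaches the pty at all.
result = c.run(
    "copy flash: configbackup.tar.gz scp: 2.3.4.5 username /PATH/configbackup.tar.gz",
    pty=True, watchers=[scppass], warn=True, hide=False,
)
print("--- captured stdout ---")
print(result.stdout)
print("--- captured stderr ---")
print(result.stderr)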

mount via paramiko fails "No such file or directory"

I am using Python's paramiko to access a remote Linux machine. My command "mount device dir" fails with "No such file or directory", even though exactly the same command succeeds when I run it remotely over a plain SSH session (not via paramiko).
I have tried varying /etc/fstab, but it's the same situation: typed over SSH it works, the same command via paramiko gives the error message above.
Any ideas?
Example command (changed minimally from the original):
import paramiko
self.ssh = paramiko.SSHClient()
self.ssh.set_missing_host_key_policy(paramiko.AutoAddPolicy())
self.ssh.connect('192.168.1.1', username='root', password='passwd')
stdin, stdout, stderr = self.ssh.exec_command("/bin/mount /dev/sda1")
gives me an error:
mount /dev/sda1 failed: mount: mounting /dev/sda1 on /media/card failed: No such file or directory
contents from /etc/fstab:
/dev/sda1 /media/card vfat fmask=0000,dmask=0000 0 0
Of course, the /media/card directory exists. Again, I can run the above command manually via SSH and it works as expected.
Update:
Meanwhile I tried Python's fabric library (built on paramiko), exactly as described in Python - How do I authenticate SSH connection with Fabric module?
c = fabric.Connection(host = '192.168.1.1', user = "root", connect_kwargs={'password': 'passwd'})
c.run("/bin/mount /dev/sda1")
giving me exactly the same error message as with paramiko directly.
Update 2: As a workaround, I now mount the drive using a direct ssh call, as suggested in the comments below. After I do whatever is necessary in my code, I unmount the drive using a "normal" paramiko call:
self.ssh.exec_command("/bin/umount /dev/sda1")
and it works. So now I am completely lost: mount as above fails, but umount works. This is really strange.
Update 3: I have additionally tried setting LD_LIBRARY_PATH to the location of mount's libraries (it needs both libm.so.6 and libc.so.6, both located in /lib), like this:
self.ssh.exec_command("export LD_LIBRARY_PATH=/lib:/usr/lib && /bin/mount /dev/sda1")
yet again, no success.
I was able to get this to work (first draft; also, I am new to Python). Anyway, here is a snippet of my code.
The biggest hang-up for me was that there seems to be a 4-to-1 requirement for backslashes in the Windows hostname (four backslashes in the Python string literal for every one that the remote mount command finally sees).
Make sure you have a share on the Windows PC first. My computer/share name in this case is "COMP_NAME/SHARE_NAME".
The username/password provided are your Windows credentials for accessing the share.
import sys
import paramiko
import constant

### START ###############################################################################
# connect to a GW device
# GW: hostname to connect to
# return: client connection object
def connectToClient(GW):
    try:
        client = paramiko.SSHClient()
        client.set_missing_host_key_policy(paramiko.AutoAddPolicy())
        client.connect(GW, username=constant.GW_USER, password=constant.GW_PASS)
    except:
        print("Unexpected error:", sys.exc_info()[0])
        return None
    return client
### END #################################################################################

### START ###############################################################################
# execute a command on the remote device
# client: client connection object to the GW
# cmd: the command to execute
#      eg. 'ls -l'
# return: nothing (TODO: maybe return error info)
def exec(client, cmd):
    stdin, stdout, stderr = client.exec_command(cmd)
    for line in stdout:
        print(line.strip('\n'))
    #for line in stderr:
    #    print(line.strip('\n'))
    return
### END #################################################################################

# other stuff
# .
# .
# .

##########################################
# Start - upload the self extracting file to the GW
##########################################
#create the mount point
exec(client, "sudo mkdir /mnt/remote_files")
#mount the source directory (4 to 1 for the back slash chars in the UNC address ...)
exec(client, "sudo mount -t cifs -o username=oxxxxxxp,password=cxxxxxxxxx0 \\\\\\\\COMP_NAME\\\\SHARE_NAME /mnt/remote_files")
#copy the script file
exec(client, "cp /mnt/remote_files/selfextract.bsx rtls/scripts/selfextract.bsx")
#unmount the remote source
exec(client, "sudo umount /mnt/remote_files")
##########################################
# Done - upload the self extracting file to the GW
##########################################

# other stuff
# .
# .
# .
Hope this helps someone..
Pat
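To make the 4-to-1 backslash point above a bit more explicit, here is a small sketch (COMP_NAME, SHARE_NAME and the credentials are placeholders, and exec() is the helper defined above) that builds the same UNC path with a raw string:
# Sketch: a raw string makes the backslash count easier to see. The four
# leading backslashes here become two after the remote shell strips one level
# of escaping, which is what mount.cifs expects at the start of a UNC path
# (\\COMP_NAME\SHARE_NAME).
unc_path = r"\\\\COMP_NAME\\SHARE_NAME"
cmd = ("sudo mount -t cifs -o username=WIN_USER,password=WIN_PASS "
       + unc_path + " /mnt/remote_files")
exec(client, cmd)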

python 3 paramiko ssh agent forward over jump host with remote command on third host

Moin!
Situation: connect to destination.host over jump.host and run a command on destination.host which, in the background, connects to another.host (my SSH key is needed on that last host).
Scheme: client --> jump.host --> destination.host --- remote_command with ssh key needed on the other host --> another.host
#!/usr/bin/python
import paramiko
jumpHost=paramiko.SSHClient()
sshKey = paramiko.RSAKey.from_private_key_file('path.to.key/file', password = 'the.passphrase')
jumpHost.set_missing_host_key_policy(paramiko.AutoAddPolicy())
jumpHost.connect('jump.hostname',username='foo', pkey = sshKey)
jumpHostTransport = jumpHost.get_transport()
dest_addr = ('destination.hostname', 22)
local_addr = ('jump.hostname', 22)
jumpHostChannel = jumpHostTransport.open_channel("direct-tcpip", dest_addr, local_addr)
destHost=paramiko.SSHClient()
destHost.set_missing_host_key_policy(paramiko.AutoAddPolicy())
destHost.connect('destination.hostname', username='foo', sock=jumpHostChannel, pkey=sshKey)
destHostAgentSession = destHost.get_transport().open_session()
paramiko.agent.AgentRequestHandler(destHostAgentSession)
stdin, stdout, stderr = destHost.exec_command("my.command.which.connects.to.another.host")
print(stdout.read())
print(stderr.read())
destHost.close()
jumpHost.close()
The above code works well if I run "local" commands on destination.host - e.g. uname, whoami, hostname, ls and so on... But if I run a command which connects in the background to another host where my SSH key is needed, the code raises this error:
raise AuthenticationException("Unable to connect to SSH agent")
paramiko.ssh_exception.AuthenticationException: Unable to connect to SSH agent
If I connect via PuTTY along the same chain, it works fine.
Can anyone give me a hint to resolve my problem?
Thx in advance.
Assumption: Your keys work across jump host and destination host.
Creating a local agent in that case will work. You could create it manually via the shell first and test it via IPython.
eval `ssh-agent`; ssh-add <my-key-file-path>
Programmatically this can be done like so:
import subprocess

# Using shell=True is not a great idea because it is a security risk.
# Refer to this post - https://security.openstack.org/guidelines/dg_avoid-shell-true.html
subprocess.check_output("eval `ssh-agent`; ssh-add <my-key-file-path>", shell=True)
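One caveat with that one-liner is that the variables ssh-agent prints are only exported inside that throwaway shell. A rough sketch (the key path is a placeholder) of also making the agent visible to the current Python process, so later paramiko/ssh calls can find it:
import os
import re
import subprocess

# Sketch: start an agent, copy the SSH_AUTH_SOCK / SSH_AGENT_PID values it
# prints into our own environment, then load the (placeholder) key with ssh-add.
agent_out = subprocess.check_output(["ssh-agent", "-s"], text=True)
for name, value in re.findall(r"(SSH_AUTH_SOCK|SSH_AGENT_PID)=([^;]+);", agent_out):
    os.environ[name] = value
subprocess.check_call(["ssh-add", "/path/to/my-key-file"])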
I am trying to do something similar and came across this post; I will update if I find a better solution.
EDIT: I have posted the implementation over here - https://adikrishnan.in/2018/10/25/agent-forwarding-with-paramiko/

Paramiko: Port Forwarding Around A NAT Router

Configuration
LOCAL: A local machine that will create an ssh connection and issue commands on a REMOTE box.
PROXY: An EC2 instance with SSH access to both LOCAL and REMOTE.
REMOTE: A remote machine sitting behind a NAT Router (inaccessible by LOCAL, but will open a connection to PROXY and allow LOCAL to tunnel to it).
Port Forwarding Steps (via command line)
Create an ssh connection from REMOTE to PROXY to forward ssh traffic on port 22 on the REMOTE machine to port 8000 on the PROXY server.
# Run from the REMOTE machine
ssh -N -R 0.0.0.0:8000:localhost:22 PROXY_USER@PROXY_HOSTNAME
Create an ssh tunnel from LOCAL to PROXY and forward ssh traffic from LOCAL:1234 to PROXY:8000 (which then forwards to REMOTE:22).
# Run from LOCAL machine
ssh -L 1234:localhost:8000 PROXY_USER@PROXY_HOSTNAME
Create the forwarded ssh connection from LOCAL to REMOTE (via PROXY).
# Run from LOCAL machine in a new terminal window
ssh -p 1234 REMOTE_USER@localhost
# I have now ssh'd to the REMOTE box and can run commands
Paramiko Research
I have looked at a handful of questions related to port forwarding using Paramiko, but they don't seem to address this specific situation.
My Question
How can I use Paramiko to run steps 2 and 3 above? I essentially would like to run:
import paramiko
# Create the tunnel connection
tunnel_cli = paramiko.SSHClient()
tunnel_cli.connect(PROXY_HOSTNAME, PROXY_PORT, PROXY_USER)
# Create the forwarded connection and issue commands from LOCAL on the REMOTE box
fwd_cli = paramiko.SSHClient()
fwd_cli.connect('localhost', LOCAL_PORT, REMOTE_USER)
fwd_cli.exec_command('pwd')
A detailed explanation of what Paramiko is doing "under the hood" can be found on @bitprophet's blog here.
Assuming the configuration above, the code I have working looks something like this:
from paramiko import SSHClient
# Set up the proxy (forwarding server) credentials
proxy_hostname = 'your.proxy.hostname'
proxy_username = 'proxy-username'
proxy_port = 22
# Instantiate a client and connect to the proxy server
proxy_client = SSHClient()
proxy_client.load_host_keys('~/.ssh/known_hosts/')
proxy_client.connect(
    proxy_hostname,
    port=proxy_port,
    username=proxy_username,
    key_filename='/path/to/your/private/key/'
)
# Get the client's transport and open a `direct-tcpip` channel passing
# the destination hostname:port and the local hostname:port
transport = proxy_client.get_transport()
dest_addr = ('0.0.0.0', 8000)
local_addr = ('127.0.0.1', 1234)
channel = transport.open_channel("direct-tcpip", dest_addr, local_addr)
# Create a NEW client and pass this channel to it as the `sock` (along with
# whatever credentials you need to auth into your REMOTE box)
remote_client = SSHClient()
remote_client.load_host_keys(hosts_file)
remote_client.connect('localhost', port=1234, username='remote_username', sock=channel)
# `remote_client` should now be able to issue commands to the REMOTE box
remote_client.exec_command('pwd')
Is the point solely to bounce SSH commands off PROXY, or do you need to forward other, non-SSH ports too?
If you just need to SSH into the REMOTE box, Paramiko supports both SSH-level gatewaying (tells the PROXY sshd to open a connection to REMOTE and forward SSH traffic on LOCAL's behalf) and ProxyCommand support (forwards all SSH traffic through a local command, which could be anything capable of talking to the remote box).
Sounds like you want the former to me, since PROXY clearly already has an sshd running. If you check out a copy of Fabric and search around for 'gateway' you will find pointers to how Fabric uses Paramiko's gateway support (I don't have time to dig up the specific spots myself right now.)
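For the ProxyCommand route mentioned above, a rough sketch under the same setup (the hostnames, usernames and the forwarded port 8000 come from this question; treat it as an illustration rather than the one canonical way):
import paramiko

# Sketch: let a local `ssh -W` process carry the bytes to localhost:8000 on the
# PROXY, which (per step 1 above) is reverse-forwarded to REMOTE's port 22.
proxy_cmd = paramiko.ProxyCommand("ssh -W localhost:8000 PROXY_USER@PROXY_HOSTNAME")

remote_client = paramiko.SSHClient()
remote_client.set_missing_host_key_policy(paramiko.AutoAddPolicy())
# The hostname/port here are mostly used for host-key lookup; the traffic
# itself flows through the ProxyCommand socket passed as `sock`.
remote_client.connect("localhost", port=8000, username="REMOTE_USER", sock=proxy_cmd)
stdin, stdout, stderr = remote_client.exec_command('pwd')
print(stdout.read())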
