Python - Paramiko isn't running a remote script properly

I am trying to run a script over SSH on a remote computer (the script is located on the remote computer). With Paramiko, all I'm doing is this:
ssh = paramiko.SSHClient()
ssh.connect(-----blacked out-----)
ssh.exec_command("python script.py")
But the command never seems to execute. The script just runs a couple of command-line commands. script.py works just fine if I run it in the remote computer's own terminal, but not when I try to run it over ssh with Paramiko as above.

You might need to pass the full path to python and/or to the script. When a command is not run in an interactive (tty) session, the profile scripts that would normally set up your PATH are not loaded, so the interpreter or script may not be found.
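A minimal sketch of that fix, assuming hypothetical paths and credentials (adjust for your remote machine):

import paramiko

ssh = paramiko.SSHClient()
ssh.set_missing_host_key_policy(paramiko.AutoAddPolicy())  # hypothetical host-key policy
ssh.connect("remote-host", username="user", password="secret")  # placeholder credentials
# Absolute paths avoid any reliance on PATH set up by profile scripts
stdin, stdout, stderr = ssh.exec_command("/usr/bin/python /home/user/script.py")
print(stdout.read().decode())
print(stderr.read().decode())
ssh.close()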

Without more info, my guess is that the command produces output which you never read, so it blocks and waits until you do. It's like echoing into a pipe when there is nothing on the other side.
I'd recommend looking into http://stackoverflow.com/a/32758464
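A sketch of draining the streams so the remote command cannot block on unread output (reusing the client from the question):

stdin, stdout, stderr = ssh.exec_command("python script.py")
output = stdout.read().decode()   # reading to EOF unblocks the remote process
errors = stderr.read().decode()
status = stdout.channel.recv_exit_status()  # wait for the command to finish
print(output, errors, status)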

I faced a similar problem. In my case, I was launching another process from a .ps1 file, and that .ps1 file was passed to the ssh.exec_command() function.
My code flow was:
ssh = paramiko.SSHClient()
ssh.connect(Host, Port, Username, Password)
ssh.exec_command("run.ps1")
# ----------------------
# Expectation:
# -> run.ps1 will execute on the remote machine
# -> run.ps1 contains "start-process notepad.exe"
# -> So it should spawn a fresh notepad process
# ----------------------
However, notepad.exe was not started on the remote system.
I made the following changes, referring to other solutions:
Converted all file paths to absolute paths.
Added a wait in run.ps1 until the child process completes.
Passed the argument to exec_command as "powershell.exe -File Absolute/path/of/file.ps1".
Maintained a log file in the Paramiko code with paramiko.util.log_to_file('sssh.log').
This time I could see that notepad.exe was running in the background.
I hope this helps with the question above.
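Putting those changes together, a sketch of what the corrected calls might look like (host details and paths are placeholders):

import paramiko

paramiko.util.log_to_file('sssh.log')  # log Paramiko's activity for debugging
ssh = paramiko.SSHClient()
ssh.set_missing_host_key_policy(paramiko.AutoAddPolicy())
ssh.connect("host", port=22, username="user", password="password")  # placeholder values
# Invoke PowerShell explicitly and give it the absolute script path
stdin, stdout, stderr = ssh.exec_command('powershell.exe -File "C:/absolute/path/of/run.ps1"')
print(stdout.read().decode(), stderr.read().decode())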

Related

Using the ssh agent inside a python script

I'm pulling from and pushing to a GitHub repository with a Python script. For the repository, I need to use an SSH key.
If I do this manually before running the script:
eval $(ssh-agent -s)
ssh-add ~/.ssh/myprivkey
everything works fine and the script runs. But after a while the key apparently expires, and I have to run those two commands again.
The thing is, if I run them inside the Python script with os.system(cmd), it doesn't work; it only works when I do it manually.
I know this must be a messy way to use the ssh agent, but I honestly don't know how it works, and I just want the script to work.
The script runs once an hour, in case that matters.
While the normal approach would be to run your Python script in a shell where the ssh-agent is already running, you can also consider an alternative approach with sshify.py:
# This utility will execute the given command (by default, your shell)
# in a subshell, with an ssh-agent process running and your
# private key added to it. When the subshell exits, the ssh-agent
# process is killed.
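For context on why the os.system() approach fails: eval $(ssh-agent -s) exports SSH_AUTH_SOCK and SSH_AGENT_PID into the calling shell, but each os.system() call runs in its own short-lived shell, so those variables never reach the rest of your script. A hedged sketch of doing the same work inside Python (the key path is the one from the question):

import os
import re
import subprocess

# Start an agent and capture the environment variables it prints
agent_output = subprocess.check_output(["ssh-agent", "-s"]).decode()
for name in ("SSH_AUTH_SOCK", "SSH_AGENT_PID"):
    match = re.search(name + r"=([^;]+);", agent_output)
    if match:
        os.environ[name] = match.group(1)  # make the agent visible to child processes

# Add the key; later git/ssh subprocesses inherit the agent environment
subprocess.check_call(["ssh-add", os.path.expanduser("~/.ssh/myprivkey")])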
Consider defining the SSH key path for the host github.com in your SSH config file, as outlined here: https://stackoverflow.com/a/65791491/14648336
On Linux, create a file called config at ~/.ssh/ and put in something similar to the above answer:
Host github.com
    HostName github.com
    User your_user_name
    IdentityFile ~/.ssh/your_ssh_priv_key_file_name
This saves you from starting an agent each time and also avoids the need for custom environment variables if you use GitPython (you mention using Python), as referenced in some other SO answers.
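With the config in place, a GitPython call needs no agent setup at all; a quick sketch (the repository path is hypothetical):

from git import Repo  # GitPython

repo = Repo("/path/to/local/repo")   # hypothetical local clone
repo.remotes.origin.pull()           # the SSH key is picked up via ~/.ssh/config
repo.remotes.origin.push()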

ssh running a command gives different results to running it locally

I have a Python script that uses Popen to start an Appium server for a simulator on a Mac:
self.appium_process = subprocess.Popen(["/usr/local/bin/appium", "-a", self.ip, "--nodeconfig", self.node_file_path, "--relaxed-security", "-p", str(appium_port), "-dc", default_capabilities], stdout=log_file, stderr=subprocess.STDOUT)
I created a bash shell script that calls the Python script. When I run the script from the local box, it works and the Appium logs show the connection.
I need to run this remotely via ssh, however. So I use the following to call the script:
ssh 10.18.66.99 automation_fw/config/testscript.sh
This, however, always ends up with the log showing:
env: node: No such file or directory
I checked, and the node app has an extra slash when it's called:
$ which node
/usr/local/bin//node
$
I tried changing the path on the machine, but nothing changed. How can I get this to run over ssh the same way it runs locally on that same box?
When you run a command via SSH you are not starting what's called a login shell (more about that here).
From the details you've shared, I would say it's something in your environment (running outside a login shell), more specifically a problem with your $PATH variable. You might want to check /etc/environment or similar paths (depending on your Linux flavour) for the wrong value.
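One way to test that theory is to force a login shell on the remote side so the profile scripts that set PATH are loaded. A sketch using subprocess, with the host and script path from the question:

import subprocess

# bash -lc loads the login profile, restoring the PATH you get interactively
result = subprocess.run(
    ["ssh", "10.18.66.99", "bash -lc 'automation_fw/config/testscript.sh'"],
    capture_output=True, text=True,
)
print(result.stdout, result.stderr)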

How to run a python script from the local machine on a remote server and display the output live?

The only solution I found was this:
cat MyScript.py | ssh username@ip_addr python -
The problem with this is that it won't show the output of the script live; it waits until the program has finished and then displays the output.
I'm using scapy in this script to sniff packets, and I have to run it on that remote server (and no, I can't copy the script there or write it there).
So what is the solution? How can I view the output of the script live in my command line?
I'm using Windows 10.
Important note: I also think ssh does some buffering and sends output only after the amount printed exceeds the buffer size, since when the output is very large part of it does suddenly appear. I want the output to reach my computer as soon as possible, not after a buffer fills up.
You should first send the file to your remote machine using
scp MyScript.py username@ip_addr:/path/to/script
then SSH to your remote machine using
ssh username@ip_addr
and finally run your script normally:
python path/to/MyScript.py
EDIT
To execute your script directly without copying it to the remote machine, use this command:
ssh user@ip_addr 'python -s' < script.py
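On the live-output part of the question: much of the buffering comes from Python itself when its stdout is not a terminal, so run the interpreter with -u (unbuffered) and read line by line. A hedged Paramiko sketch, with placeholder connection details:

import paramiko

ssh = paramiko.SSHClient()
ssh.set_missing_host_key_policy(paramiko.AutoAddPolicy())
ssh.connect("ip_addr", username="username", password="secret")  # placeholders
# -u disables Python's output buffering; '-' makes it read the script from stdin
stdin, stdout, stderr = ssh.exec_command("python -u -")
stdin.write(open("MyScript.py").read())
stdin.channel.shutdown_write()  # signal EOF so the interpreter starts running
for line in stdout:             # lines arrive as the remote script prints them
    print(line, end="")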

ssh session as python subprocess takes input but does not print it to stdout

I'm trying to use Python's cmd library to create a shell with limited commands. One requirement I have is to be able to run a command that executes an existing shell script, which opens an ssh session on a remote machine and from there allows the user to interact with the remote shell as if it were a regular ssh session.
Simply using subprocess.Popen(['/path/to/connect.sh']) works well, at least as a starting point, except for one issue: you can interact with the remote shell, but the input you type is not shown on stdout. For example, you see the prompt on your stdout, but when you type 'ls' you don't see it being typed; when you hit return it works as expected.
I'm trying to wrap my head around how to print the input to stdout and still send it along to the remote ssh session.
EDIT:
The actual code, without using cmd, was just this one line:
ssh_session = subprocess.Popen(['connect.sh'])
It was fired from a do_* method in a class which extended cmd.Cmd. I think I may end up using Paramiko, but I'd still be interested in anyone's input on this.
Assuming you are using a Unix-like system, ssh detects whether you are on a terminal. When it detects that you are not, as when run under subprocess, it does not echo the characters typed. Instead you might want to use a pseudo-terminal; see pexpect or pty. That way you can get output from ssh as if it were running on a true terminal.
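A minimal pexpect sketch of that approach, using the script path from the question:

import pexpect

# spawn allocates a pseudo-terminal, so ssh behaves as if run interactively
child = pexpect.spawn("/path/to/connect.sh")
# interact() hands control to the user: keystrokes are echoed and forwarded
child.interact()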

Calling a remote interactive python script from a bash script: Input request lag

I wrote a bash script that at one point automates the installation of some software on a remote host, like so:
ssh user@remotehost "<path>/install-script"
where install-script is a bash script. That bash script at some point calls another bash script, which in turn calls an interactive Python script that uses raw_input() to gather user input.
When I run the install script normally (from a bash shell), it prompts for and accepts input perfectly fine. However, when the above piece of code from my script runs, I get no prompt until after I type the input.
The only script I really have control over is my automation script.
I have read this question: "Python - how can I read stdin from shell, and send stdout to shell and file." However, I have no problem running in a normal bash shell, only via a remote command over ssh.
Can this issue be fixed from within my script (and if so, how?), or would I have to modify the python script?
UPDATE
To clarify, the input prompts I am referring to are the prompts from the Python script. I am not entering a password for ssh (the remote host has my public key in its authorized_keys file).
UPDATE
For further clarification, my script (bash) is calling the install script (bash) that calls another bash script that finally calls the python script, which prompts for user input.
i.e. bash -> bash -> bash -> python
ssh user@remotehost "<path>/install-script"
When you run ssh and specify a command to invoke on the remote system, by default ssh will not allocate a PTY (pseudo-TTY) for the remote session. This means that you will communicate with the remote process through a set of pipes rather than a TTY.
When writing to a pipe, unix programs will typically buffer their write operations. You won't see output written to a pipe until the process writing to the pipe flushes its buffer, or when it exits--because buffered output is normally flushed on exit. Beyond that, a process can detect whether it's writing to a file, pipe, or tty, and it may adjust its behavior. For example, a shell like bash won't print command prompts when reading commands from a pipe.
You can force ssh to request a TTY for the session using the -t option (doubled as -tt to force allocation even when ssh's own input is not a terminal):
ssh -tt user@remotehost "<path>/install-script"
Refer to the ssh man page for details.
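For completeness, if the automation ever moves to Python, the Paramiko equivalent of -tt is to request a PTY explicitly; a sketch with placeholder host details:

import paramiko

ssh = paramiko.SSHClient()
ssh.set_missing_host_key_policy(paramiko.AutoAddPolicy())
ssh.connect("remotehost", username="user")  # assumes key-based auth, as in the question
# get_pty=True asks the server for a pseudo-terminal, so prompts are flushed immediately
stdin, stdout, stderr = ssh.exec_command("<path>/install-script", get_pty=True)
for line in stdout:
    print(line, end="")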
