Is it possible to stream the output of a program that is being executed via ssh?
Example program (test.py on remote):
import time

while True:
    print time.time()
    time.sleep(1)
Command (local):
ssh name@remote 'python test.py'
Since the program never terminates, and Python block-buffers stdout when it is not attached to a terminal, no output ever arrives locally. Is streaming possible in some way?
Apparently, adding the -t option to the ssh command works: it forces pseudo-terminal allocation, so stdout becomes line-buffered and each line is flushed as it is printed:
ssh -t name@remote 'python test.py'
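An alternative that avoids allocating a pseudo-terminal is to disable Python's buffering on the remote side with python -u. If you also want to consume the stream from a local Python 2 script (matching the code above), a minimal sketch, assuming the same name@remote host and test.py path:

import subprocess

# -u disables Python's stdout buffering on the remote side,
# so each line is flushed as soon as it is printed.
proc = subprocess.Popen(['ssh', 'name@remote', 'python -u test.py'],
                        stdout=subprocess.PIPE)
# Read line by line instead of waiting for the process to exit.
for line in iter(proc.stdout.readline, ''):
    print(line.rstrip())   # arrives roughly once per second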
Related
I am trying to run a local bash script on a remote server without copying it to the remote server. It is as simple as the following test. It runs perfectly on more than a few servers, but on some servers running tcsh there is an issue. How do I invoke bash if the following does not work? Below is the dummy test.sh:
#!/bin/bash
a=test
echo $a
echo $SHELL
I am using Python Paramiko's exec_command for remote execution as follows:
my_script = open("test.sh").read()
stdin, stdout, stderr = ssh.exec_command(my_script, timeout=15)
print(stdout.read().decode())
err = stderr.read().decode()
if err:
    print(err)
Note that the connection works, and the same script works on other servers whose default shell is bash.
This is the output that I get:
/bin/tcsh
printing from errors
a=test: Command not found.
a: Undefined variable.
The #!/bin/bash shebang is just a comment to the shell. Sending it to a remote shell as part of a command has no effect.
You have to execute /bin/bash on the server and send your script to it:
stdin, stdout, stderr = ssh.exec_command("/bin/bash", timeout=15)
stdin.write(my_script)
Also, you have to exit the shell at the end of your script, otherwise it will never end.
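Putting it together, a minimal sketch (assuming ssh is an already connected paramiko.SSHClient, as in the question):

my_script = open("test.sh").read()
# Start bash explicitly so the script is not interpreted by tcsh.
stdin, stdout, stderr = ssh.exec_command("/bin/bash", timeout=15)
stdin.write(my_script)
stdin.write("exit\n")   # terminate bash; otherwise stdout.read() blocks forever
stdin.flush()
print(stdout.read().decode())
err = stderr.read().decode()
if err:
    print(err)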
Related question:
Pass arguments to a bash script stored locally and needs to be executed on a remote machine using Python Paramiko
Consider this Python script:
import subprocess
nc = subprocess.Popen(["/bin/bash"], stdin=subprocess.PIPE, text=True)
nc.stdin.write("nc localhost 2222\n")
nc.stdin.write("pwd\n")
When I listen with netcat as nc -lnvp 2222, I successfully connect and the string pwd is sent; nothing more happens, of course.
Now I get a non-stable PHP reverse shell (a completely new event) and connect through netcat successfully. I execute this script to upgrade the shell and print the current directory. (That listener is another Popen instance, by the way.)
import subprocess
nc = subprocess.Popen(["/bin/bash"], stdin=subprocess.PIPE, text=True)
nc.stdin.write("nc localhost 2222\n")
nc.stdin.write('python3 -c "import pty;pty.spawn(\'/bin/bash\')"\n')
nc.stdin.write('pwd\n')
Now when I execute that Python script, I expected the input to go through netcat, get executed in the new bash TTY, spawn a stable shell, and then pass pwd to return the current directory. But the script only works up to spawning the stable shell; after that, stdin input doesn't go through nc, or something else happens that I'm not aware of.
What's happening here?
Edit: I need to be able to run multiple commands. Using subprocess.communicate(input=<command>) waits for the process to exit, so it deadlocks and can't accept further stdin.
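One thing worth checking in this setup: writes to a Popen pipe are buffered, so a command may never reach nc until the buffer is flushed or the pipe is closed. A minimal sketch that flushes after every command and paces the writes (same localhost:2222 setup as above; the one-second sleep is a guess, not a tested value):

import subprocess
import time

nc = subprocess.Popen(["/bin/bash"], stdin=subprocess.PIPE, text=True)

def send(cmd):
    # Flush after every write so the command actually reaches nc
    # instead of sitting in Python's pipe buffer.
    nc.stdin.write(cmd + "\n")
    nc.stdin.flush()
    time.sleep(1)  # crude pacing; the pty spawn needs a moment

send("nc localhost 2222")
send('python3 -c "import pty;pty.spawn(\'/bin/bash\')"')
send("pwd")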
Overview
I'm trying to use Python Fabric to run an ssh command as root on a remote server.
The command: nohup ./foo &
foo is expected to run for several days. I must be able to disassociate foo from Fabric's remote ssh session and put foo in the background.
The Fabric FAQ says you should use something like screen or tmux when you run your Fabric script (which runs the backgrounded command). I tried that, but my Fabric script still hung; foo itself is not hanging.
Question
How do I use fabric to run this command on a remote server without the script hanging: nohup ./foo &
Details
This is my script:
#!/bin/sh
# Credit: https://unix.stackexchange.com/a/20895/6766
if "true" : '''\'
then
exec "/nfs/it/network_python/$OSREL/bin/python" "$0" "$#"
exit 127
fi
'''
from getpass import getpass
import os
from fabric import Connection, Config
assert os.geteuid()==0, "ERROR: Must run as root"
for host in ['host1.foo.local', 'host2.foo.local']:
    # Make an ssh connection to the host...
    conn = Connection(host)
    # The script always hangs at this line
    result = conn.run('nohup ./foo &', warn=True, hide=True)
I always open a tmux session to run the aforementioned script in; even doing so, the script hangs when I get to conn.run(), above.
I'm running the script on a vanilla CentOS 6.5 VM; it runs under Python 2.7.10 and Fabric 2.1.
The Fabric FAQ is unclear... I thought the FAQ wanted tmux used on the local side when I executed the Fabric script.
The correct way to fix this problem is to replace nohup in the remote command with screen -d -m <command>. Now I can run the whole script locally with no hangs (and I don't have to use tmux in the local terminal).
Explicitly, I have to rewrite the last line of my script in my question as:
# Remove &, and nohup...
result = conn.run('screen -d -m ./foo', warn=True, hide=True)
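For context: the hang happens because ssh keeps the session open until the remote command's stdout and stderr are closed, and a plain nohup ./foo & leaves them attached to the session. If screen is not available on the remote host, redirecting all three streams should also release the session (a sketch, not something the Fabric FAQ prescribes):

# Detach foo from the ssh session entirely by redirecting
# stdin, stdout, and stderr away from the session's pipes.
result = conn.run('nohup ./foo > /dev/null 2>&1 < /dev/null &', warn=True, hide=True)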
I have two Raspberry Pis. I am trying to transfer files from one Pi to the other using scp. I am trying to do this through Python because the program that will be transferring files is a Python file.
Below is the shell script I have for the scp part (pass and IP blurred out):
#!/bin/sh
sshpass -p ######## scp test.txt pi@IP:/home/pi
And below is the Python script that launches that shell script:
import subprocess
subprocess.call(['./ssh.sh'])
print("DONE")
For some reason the Python script doesn't kick back any errors and hits the print line, but the file is not transferred. When I run the scp command outside of Python, the file transfers just fine. Am I doing something incorrect here?
****EDIT****
I can't even get subprocess to work with this, which is why I ended up using a shell script. Here is my attempt with subprocess:
import subprocess
subprocess.call("sshpass -p ######## scp test.txt pi@IP:/home/pi")
print("DONE")
Again I get no errors, but the file is not transferred
****EDIT #2****
So I found out that because sshpass is being used, scp isn't prompting me to add the IP to known_hosts; as a result, the file simply isn't transferred at all. I need a way to add this acceptance into the script, i.e. I get the following if I launch the command without sshpass:
The authenticity of host 'IP (IP)' can't be established.
ECDSA key fingerprint is 13:91:24:8e:6f:21:98:1f:5b:3a:c8:42:7a:88:e9:91.
Are you sure you want to continue connecting (yes/no)?
I want to pass "yes\n" to this prompt, as well as the password afterwards. Is this possible?
For the first query
You can use subprocess.Popen to get the output (stdout) and error (stderr) of the executed command.
import subprocess
cmd = 'sshpass -p ****** scp dinesh.txt root@256.219.210.135:/root'
p = subprocess.Popen(cmd.split(), stdout=subprocess.PIPE, stderr=subprocess.PIPE)
out, err = p.communicate()
print "Output is ",out
print "Error is ",err
If you execute the above code with a wrong password, you will get the output below:
[root#centos /]# python code.py
Output is
Error is Permission denied, please try again.
In this case, if the file is successfully transferred, there is no output.
If you execute a command like ls -l, then its output will be printed.
For your second query (****EDIT #2****)
Options are :
Passwordless SSH. Check this.
Pexpect (see the sketch after this list).
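A minimal Pexpect sketch for answering both prompts (the file, host, and password placeholders are the question's, not real values):

import pexpect

child = pexpect.spawn('scp test.txt pi@IP:/home/pi')
# The host-key prompt only appears on the first connection.
i = child.expect(['yes/no', 'password:'])
if i == 0:
    # Accept the host key, then wait for the password prompt.
    child.sendline('yes')
    child.expect('password:')
child.sendline('########')   # the blurred-out password
child.expect(pexpect.EOF)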
I found a much easier way of tackling all of this:
sshpass -p ###### scp -o StrictHostKeyChecking=no test.txt pi@IP:/home/pi
The -o option makes scp store the IP in known_hosts automatically, so I do not need to interact with the prompt at all. The Python-to-shell interaction works with that addition; doing this solely through subprocess also works.
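For reference, the pure-subprocess version of that command might look like this (same blurred password and IP as above):

import subprocess

# Passing the command as a list avoids shell quoting issues.
subprocess.call(['sshpass', '-p', '########',
                 'scp', '-o', 'StrictHostKeyChecking=no',
                 'test.txt', 'pi@IP:/home/pi'])
print("DONE")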
If you don't mind trying other approaches, it is worth using SCPClient from the scp module.
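A minimal sketch with Paramiko plus the scp package (host, credentials, and paths are the question's placeholders):

from paramiko import SSHClient, AutoAddPolicy
from scp import SCPClient

ssh = SSHClient()
# Equivalent of StrictHostKeyChecking=no: accept unknown host keys.
ssh.set_missing_host_key_policy(AutoAddPolicy())
ssh.connect('IP', username='pi', password='########')

with SCPClient(ssh.get_transport()) as scp:
    scp.put('test.txt', '/home/pi')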
I am trying to monitor the available physical disk space of a remote machine using a Python script, which executes the df -h . command via subprocess.Popen.
import subprocess
import time
command = 'ssh remoteserver "df -h ."'
while True:
    proc = subprocess.Popen(command, shell=True, stdout=subprocess.PIPE, stderr=subprocess.PIPE)
    output, err = proc.communicate()
    print output
    print err
    time.sleep(60)
The script runs fine and prints the output to the terminal when run from the command line:
$> python2.7 script.py
Filesystem Size Used Avail Use% Mounted on
remoteserver:/home/user
555G 447G 109G 81% /home
The script does not produce any output and seems to block when it is started with the nohup command:
$> nohup python2.7 script.py &
I would like the above script to fetch the disk space of the remote machine even when started under nohup.
I'm not 100% sure of the underlying issue here, but when you invoke nohup in the shell, it disconnects stdin/stdout from the terminal process, which I suspect is causing some of the interactions you're seeing.
Given that you're doing this against a remote machine, I'd actually recommend you look at using something like Fabric as a library to do what you're after. It's pretty straightforward, and it does most of the handling of terminal sessions as well as closing things down nicely for you when you're done.
Something like:
from fabric import api
from fabric.api import env
import fabric

# username, remote_host, and password are assumed to be defined elsewhere
env.host_string = '%s@%s' % (username, remote_host)
env.disable_known_hosts = True
env.password = password

# Silence Fabric's own echoing of stdout/stderr
fabric.state.output['stdout'] = False
fabric.state.output['stderr'] = False

results = api.run('df -h')
You might try sending stdin=subprocess.PIPE to the subprocess command, then calling proc.stdin.close() on the next line, before the communicate() call. Or you can try changing the command to 'ssh remoteserver "df -h ." </dev/null'. Others report using FNULL = open(os.devnull, 'r') and passing in FNULL to the stdin= argument, but I'm not sure if you need to call FNULL.close() after or not.
SSH is most likely waiting for input for some reason when it is run from nohup. Perhaps it is unable to authenticate in the nohup environment and is asking for password input?
To make sure SSH is not waiting for input, try adding -o "BatchMode yes" to the ssh command and see if there are some clues in the output/error from the subprocess communicate call.
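A sketch combining both suggestions (stdin redirected from /dev/null plus BatchMode), against the same remoteserver as the question:

import subprocess
import time

command = 'ssh -o BatchMode=yes remoteserver "df -h ." </dev/null'

while True:
    # With stdin coming from /dev/null, ssh cannot block waiting for
    # terminal input under nohup; BatchMode makes any password prompt
    # fail fast with an error instead of hanging.
    proc = subprocess.Popen(command, shell=True,
                            stdout=subprocess.PIPE, stderr=subprocess.PIPE)
    output, err = proc.communicate()
    print(output)
    print(err)
    time.sleep(60)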