Unable to truly background an SSH tunnel using Fabric and nohup - python

I can't seem to get Fabric to play nicely with backgrounding a process that I've wrapped in nohup. It should be possible, given various pieces of information, including here and here.
from fabric.api import env, run, settings

def test():
    h = 'xxxxx.compute-1.amazonaws.com'
    ports = [16646, 9090, 6666]
    with settings(host_string=h):
        tun_s = "ssh -o StrictHostKeyChecking=no -i ~/.ssh/kp.pem %s@%s " % (env.user, h)
        for port in ports:
            p_forward = "-L %d:localhost:%d" % (port, port)
            tun_s = "%s %s" % (tun_s, p_forward)
        tun_s = "%s -N" % tun_s
        # create the tunnel...
        print "creating tunnel %s" % tun_s
        run("nohup '%s' >& /dev/null < /dev/null &" % tun_s)
    print "fin"
Abbreviated output:
ubuntu@domU-xxx:~/deploy$ fab test
executing on tunnel ssh -o StrictHostKeyChecking=no -i ~/.ssh/kp.pem ubuntu@xxx -L 16646:localhost:16646 -L 9090:localhost:9090 -L 6666:localhost:6666 -N
[xxx.compute-1.amazonaws.com] run: nohup 'ssh -o StrictHostKeyChecking=no -i ~/.ssh/kp.pem ubuntu@xxx.compute-1.amazonaws.com -L 16646:localhost:16646 -L 9090:localhost:9090 -L 6666:localhost:6666 -N' >& /dev/null < /dev/null &
fin
Done.
Disconnecting from xxxx
I know there is no problem with the tunnel command itself, because if I strip away the nohup wrapping it works fine (though Fabric then hangs, as expected). I'm fairly sure the process is not being properly detached, and when the run function returns, the tunnel process immediately dies.
But why?
This also happens with a python command in another part of my code.

So, after much wrangling, it seems this is not possible for whatever reason with my setup (default Ubuntu installs on EC2 instances). I have no idea why, since various sources suggest it should be possible.
I fixed my particular problem by using Paramiko in place of Fabric for calls that need to be left running in the background. The following achieves this:
import paramiko
ssh = paramiko.SSHClient()
ssh.set_missing_host_key_policy(paramiko.AutoAddPolicy())
privkey = paramiko.RSAKey.from_private_key_file('xxx.pem')
ssh.connect('xxx.compute-1.amazonaws.com', username='ubuntu', pkey=privkey)
stdin, stdout, stderr = ssh.exec_command("nohup ssh -f -o StrictHostKeyChecking=no -i ~/.ssh/xxx.pem ubuntu@xxx.compute-1.amazonaws.com -L 16646:localhost:16646 -N >& /dev/null < /dev/null &")
ssh.close()
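For reference, the command string the backgrounded call executes can be assembled with a small helper. This is a sketch, not the original code, and all names are placeholders taken from the question; note it deliberately leaves the tunnel command unquoted inside nohup, since wrapping the whole string in single quotes (as in the Fabric attempt above) makes the shell look for a single program whose name is the entire command, which is one plausible reason for the silent failure:

```python
def build_tunnel_cmd(user, host, ports, key="~/.ssh/kp.pem"):
    """Assemble an ssh port-forwarding command wrapped for backgrounding.
    All hostnames and key paths here are placeholders."""
    forwards = " ".join("-L %d:localhost:%d" % (p, p) for p in ports)
    tunnel = "ssh -f -o StrictHostKeyChecking=no -i %s %s@%s %s -N" % (
        key, user, host, forwards)
    # The tunnel command is NOT quoted, so the shell parses its arguments.
    return "nohup %s >/dev/null 2>&1 </dev/null &" % tunnel

cmd = build_tunnel_cmd("ubuntu", "xxx.compute-1.amazonaws.com",
                       [16646, 9090, 6666])
print(cmd)
```

The resulting string can then be handed to exec_command as in the snippet above.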

Related

Py "permission denied" when running bash script

I made a script that connects to an SSH server using Paramiko. It runs some commands stored in a bash script to copy and collect some data files, and afterwards I use another command on the command line to copy the data from the SSH (SVN) server locally. Everything works, but when I run the Python script it says "permission denied".
The error is not about the bash script itself: I added an echo "bash it works" inside it to verify that its commands run.
The py script:
import paramiko
import os

hostname = "LOGIN AND CONNECTION WORKS!"
username = ""
password = ""
# initialize the SSH client
client = paramiko.SSHClient()
# add to known hosts
client.set_missing_host_key_policy(paramiko.AutoAddPolicy())
try:
    client.connect(hostname=hostname, username=username, password=password)
    print("Connection was established!")
except Exception:
    print("[!] Cannot connect to the SSH Server")
    exit()
# read the BASH script content from the file
bash_script = open("Collect-Usage-SVN.sh").read()
# execute the BASH script
stdin, stdout, stderr = client.exec_command(bash_script)
# read the standard output and print it
print(stdout.read().decode())
# print errors if there are any
err = stderr.read().decode()
if err:
    print(err)
# execute cmd command to copy files locally (raw string avoids backslash escapes)
os.chdir(r'D:\GIT-files\Automate-Stats\SVN_sample_files\svnrawdatas')
os.system("start cmd /K scp root@atpcnd6c:/data/audit/2022-07-08-* .")
client.close()
THIS IS THE BASH_SCRIPT.SH:
#!/usr/bin/env python
echo "Bash script its running"
CurrentDate=$(date +"%F")
RawDataFolder="/data/audit"
svnLOGFILE="/data/audit/log-svn-usage-data-collection.log"
#echo "Timestamp when starting the work $(date +"%D %T")" >> $svnLOGFILE
echo "Timestamp when starting the work $(date +"%F %T")" >> $svnLOGFILE
# Collect raw data
echo "Generating raw SVN usage data" >> $svnLOGFILE
cp -v /data/conf/mod_authrewrite.map $RawDataFolder/$CurrentDate-svnRawData-mod_authrewrite.map.txt >> $svnLOGFILE;
cp -v /data/conf/svn_authorization.conf $RawDataFolder/$CurrentDate-svnRawData-authorization.conf.txt >> $svnLOGFILE;
cut -d: -f1 /data/conf/localauthfile.htpasswd > $RawDataFolder/$CurrentDate-svnRawData-localauthfile.htpasswd.txt
cd /data/svn; ls -ltr /data/svn | du -h --max-depth=1 > /data/audit/2022-05-06-svnRawData-repositoriesSize.csv;
for repo in /data/svn/*; do echo $repo; svnlook date $repo; done > $RawDataFolder/$CurrentDate-svnRawData-repositoriesLastChangeDate.csv;
echo "Finished generating raw data" >> $svnLOGFILE
echo "Timestamp when work is finished $(date +"%D %T")" >> $svnLOGFILE
echo "Happy data analysis !" >> $svnLOGFILE
echo "***********************************************************************************" >> $svnLOGFILE
echo "/n" >> svnLOGFILE
On the first line I also tried: #!/usr/bin/env
This is how I run the SVN script on the server, directly from the bash shell:
/data/audit/Collect-Usage-SVN.sh > /dev/null 2>&1
The command below is used in a CMD terminal to copy the SVN and GIT raw files from the SVN server to the local Windows disk:
scp root@user:/data/audit/2022-07-08-* .
(see example in Copy_from_SVN_server.jpeg)
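One workaround sketch, assuming the failure comes from the script file's execute bit or shebang on the server rather than from scp: feed the script's text to a remote `bash -s` on stdin, so the file never needs to be executable remotely. The helper below is hypothetical; `client` stands for an already-connected paramiko.SSHClient:

```python
def run_script_via_bash(client, script_path):
    """Pipe a local shell script into a remote 'bash -s', so neither the
    execute bit nor the shebang of the file matters on the server.
    `client` is assumed to be a connected paramiko.SSHClient."""
    with open(script_path) as f:
        script = f.read()
    stdin, stdout, stderr = client.exec_command("bash -s")
    stdin.write(script)
    stdin.channel.shutdown_write()  # signal EOF so bash can finish
    return stdout.read().decode(), stderr.read().decode()
```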

Cannot execute shell command with redirected multiline input using Python Paramiko

I am using Paramiko to do the standard SSH into a box, run commands, and display the STDOUT to my terminal. Due to sudo rules, I SSH into a machine with my username and run sudo /bin/su -s /bin/bash - <diff user account>. In my script, I am passing the following command into Paramiko but the STDOUT does not show on my screen. I believe this is because the sudo command is opening a new shell and Paramiko is watching the STDOUT on the new shell.
The commands DO run as I have logged onto the box and see the command history. How do I get the STDOUT of the commands I am running to show on my terminal?
import paramiko

def sshCommand(hostname, port, username, command, key_filename='/home/<my username>/.ssh/id_rsa'):
    sshClient = paramiko.SSHClient()
    sshClient.set_missing_host_key_policy(paramiko.AutoAddPolicy())
    sshClient.connect(hostname=hostname, port=port, username=username, key_filename=key_filename)
    stdin, stdout, stderr = sshClient.exec_command(command, get_pty=False)
    output = stdout.readlines()
    print(output)

sshCommand('<servername>', 22, '<my username>', """sudo /bin/su -s /bin/bash - <diff username> << 'EOF'
echo "Hello World"
EOF
""")
I do not think that OpenSSH SSH server can accept the << 'EOF' shell construct on the "exec" channel.
But this should work:
echo echo "Hello World" | sudo /bin/su -s /bin/bash - <diff username>
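The piped form can be built in Python with shlex.quote, so the inner command survives the outer shell intact. A sketch (the target username is a placeholder):

```python
import shlex

def pipe_to_su(inner_cmd, target_user):
    """Build the 'echo ... | sudo su' form of the command: the inner
    command arrives on the target shell's stdin instead of a heredoc."""
    return "echo %s | sudo /bin/su -s /bin/bash - %s" % (
        shlex.quote(inner_cmd), shlex.quote(target_user))

print(pipe_to_su('echo "Hello World"', "appuser"))
```

The resulting string can then be passed as the command argument to exec_command, as in the question's sshCommand helper.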

Unable to get the proper exit value of a python script, being run on a remote machine

I am new to programming. Please be kind. :)
I am trying to invoke a python script on a remote machine from a shell script on my local machine.
I know that we can exit a Python script with values ranging from 0 to 127. I am trying to exit my script with a value of 3, and I verified with a print that the exit value is correct on the remote machine.
But on my local machine, I always see the exit value of remote script as 0.
This is my shell script on the local machine.
sshpass -p password ssh -o StrictHostKeyChecking=no root@10.10.10.10 << EOF
cd /root/drm/myDir
./bla.py
echo $?
EOF
This is my python script on a remote machine:
import os
import sys

for curr_tc in range(1, 10):
    cmd = '..........'
    os.system(cmd)
    # ... (lines omitted)
    if 'PASSED' in lineList[-5]:
        continue
    else:
        exit(curr_tc)
exit(0)
Please point out my mistake. Thanks in advance.
The reason why this fails is that $? is expanded on the client side.
The best way to fix this is to not inspect the value on the server side at all, and instead let it get propagated to the client side:
sshpass -p password ssh -o StrictHostKeyChecking=no root@10.10.10.10 << EOF
cd /root/drm/myDir
./bla.py
EOF
echo "SSH relayed the exit code, look: $?"
This allows it to work with all forms of if statements, set -e, or other ways of inspecting exit codes on the client.
The alternative way is to make sure the $? is escaped by quoting the here document:
sshpass -p password ssh -o StrictHostKeyChecking=no root@10.10.10.10 << "EOF"
cd /root/drm/myDir
./bla.py
echo "The command exited with $?"
EOF
echo "SSH relayed the exit code of echo itself, check it out: $?"
This will print the exit code correctly, but ssh itself will always count it as a success because the last command, echo, successfully printed stuff.
You could rewrite without using EOF like so:
result=$(sshpass -p password ssh -o StrictHostKeyChecking=no root@10.10.10.10 'cd /root/drm/myDir; ./bla.py; echo $?')
echo $result
This will run all the commands on the host, including the expansion of $?. When using the EOF heredoc, the expansion happens on your local machine, as noted in the other answer. Tested on my machines with a simple bash script that just ran exit 3, and it worked.
You could then compare the result using a simple if statement like this:
if [ "$result" -eq 3 ]; then
    echo "The process completed successfully."
else
    echo "Process returned unexpected return code $result."
    # Do whatever you need to if this happens.
fi
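The same propagation can be used from Python: ssh exits with the remote command's status when the command is passed as an argument rather than a here-document, so subprocess sees it directly. A sketch (the local sh call just demonstrates the mechanism; substitute the ssh argv for real use):

```python
import subprocess

def exit_code_of(argv):
    """Run a command and return its exit status. With argv like
    ["ssh", "root@10.10.10.10", "cd /root/drm/myDir && ./bla.py"]
    this is the remote script's exit value, since ssh adopts it."""
    return subprocess.run(argv).returncode

# Local demonstration of the same mechanism:
print(exit_code_of(["sh", "-c", "exit 3"]))  # prints 3
```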

pass bash script args as named parameters to a command inside the script

I have a bash script that takes two parameters. Inside that script, I need to call ssh using a heredoc and call a method that expects the two arguments. For example:
ssh -o "IdentitiesOnly=yes" -t -i $key -l user localhost << 'ENDSSH'
/my_python_app.py -u -t tar -p $1 -f $2
ENDSSH
key is set by my script, I know that part is good.
However, my_python_app prints out args and it doesn't show any arguments for -p and -f
I would call my script like
my_script /tmp filename
I use argparse in my python app, but I am also printing out sys.argv and it gives me:
['my_python_app.py', '-u', '-t', 'tar', '-p', '-f']
Note there are no values received for -p and -f. (-u is a flag, and that is set correctly).
How do I pass $1 and $2 to my_python_app as the -p and -f values?
Remove the quotes around the here-document delimiter (i.e. use << ENDSSH instead of << 'ENDSSH'). The quotes tell the shell not to expand variable references (and some other things) in the here-document, so $1 and $2 are passed through to the remote shell... which doesn't have any parameters so it replaces them with nothing.
BTW, removing the single-quotes may not fully work, since if either argument contains whitespace or shell metacharacters, the remote end will parse those in a way you probably don't intend. As long as neither argument can contain a single-quote, you can use this:
ssh -o "IdentitiesOnly=yes" -t -i $key -l user localhost << ENDSSH
/my_python_app.py -u -t tar -p '$1' -f '$2'
ENDSSH
If either might contain single-quotes, it gets a little more complicated.
The more paranoid way to do this would be:
# store these in an array to reduce the incidental complexity below
ssh_args=( -o "IdentitiesOnly=yes" -t -i "$key" -l user )
posixQuote() {
    python -c 'import sys, pipes; sys.stdout.write(pipes.quote(sys.argv[1])+"\n")' "$@"
}
ssh "${ssh_args[@]}" localhost "bash -s $(posixQuote "$1") $(posixQuote "$2")" << 'ENDSSH'
/path/to/my_python_app.py -u -t tar -p "$1" -f "$2"
ENDSSH
If you know with certainty that the destination account's shell matches the local one (bash if the local shell is bash, ksh if the local shell is ksh), consider the following instead:
printf -v remoteCmd '%q ' /path/to/my_python_app.py -u -t tar -p "$1" -f "$2"
ssh "${ssh_args[@]}" localhost "$remoteCmd"
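Note that the pipes module used in posixQuote is deprecated (removed in Python 3.13); shlex.quote is its modern replacement, and shlex.join (Python 3.8+) quotes a whole argv in one call, much like printf '%q '. A sketch, with example paths standing in for the real arguments:

```python
import shlex

# shlex.join quotes each argument so the remote shell sees exactly one
# word per argument, even with spaces or quotes in the values:
remote_cmd = shlex.join(["/path/to/my_python_app.py", "-u", "-t", "tar",
                         "-p", "/tmp/my dir", "-f", "file'name"])
print(remote_cmd)
```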

Executing remote python script in background over SSH

I have a python file "run.py" like below on my remote server.
import subprocess
subprocess.Popen(["nohup", "python", "/home/admin/Packet/application.py", "&"])
I want to run that file from my local computer using SSH. I'm trying like the below. However, my local terminal got stuck there. It seems it isn't being run in the background.
ssh -n -f -i /Users/aws/aws.pem admin@hello_world.com 'python /home/admin/run.py'
After running that command, my terminal got stuck.
The following is an example I'm using, you can try something like this, customizing the ssh_options.
import subprocess
ssh_options = '-o ConnectTimeout=10 -o PasswordAuthentication=no -o PreferredAuthentications=publickey -o StrictHostKeyChecking=no'
server_name = 'remote_server.domain'
cmd = 'ssh ' + ssh_options + ' ' + server_name + ' "/usr/bin/nohup /usr/bin/python /home/admin/run.py 2>&1 &"'
p = subprocess.Popen(cmd, shell=True, stdout=subprocess.PIPE, stderr=subprocess.PIPE)
Later you can redirect the output to a flat file, changing :
2>&1 &
for:
>> /path/to/log_file.txt 2>&1 &
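Putting it together, the full command with a log file might look like the sketch below; the server name and paths are placeholders, and the Popen call is left commented out since it needs a reachable server:

```python
import subprocess

ssh_options = ('-o ConnectTimeout=10 -o PasswordAuthentication=no '
               '-o PreferredAuthentications=publickey -o StrictHostKeyChecking=no')
server_name = 'remote_server.domain'
logfile = '/path/to/log_file.txt'
# nohup + trailing & detach the remote process; the log file captures output
remote = 'nohup /usr/bin/python /home/admin/run.py >> %s 2>&1 &' % logfile
cmd = 'ssh %s %s "%s"' % (ssh_options, server_name, remote)
# p = subprocess.Popen(cmd, shell=True, stdout=subprocess.PIPE, stderr=subprocess.PIPE)
print(cmd)
```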
