Py "permission denied" when running bash script - python

I made a script that connects to an SSH server using Paramiko and runs some bash commands stored inside a bash script to copy and collect some data files. After that, it runs a command-line scp to copy the data from the SSH server (SVN) locally. Everything works when done manually, but when I run the Python script it says "permission denied".
The error is not about the bash script itself: I put an echo "bash its works" in the bash script to verify that the commands inside it run.
The Python script:
import paramiko
import os

hostname = "LOGIN AND CONNECTION WORKS!"
username = ""
password = ""

# initialize the SSH client
client = paramiko.SSHClient()
# automatically add the server to known hosts
client.set_missing_host_key_policy(paramiko.AutoAddPolicy())
try:
    client.connect(hostname=hostname, username=username, password=password)
    print("Connection was established!")
except Exception:
    print("[!] Cannot connect to the SSH Server")
    exit()

# read the bash script content from the file
bash_script = open("Collect-Usage-SVN.sh").read()
# execute the bash script
stdin, stdout, stderr = client.exec_command(bash_script)
# read the standard output and print it
print(stdout.read().decode())
# print errors if there are any
err = stderr.read().decode()
if err:
    print(err)

# execute a cmd command to copy the files locally
os.chdir(r'D:\GIT-files\Automate-Stats\SVN_sample_files\svnrawdatas')
os.system("start cmd /K scp root@atpcnd6c:/data/audit/2022-07-08-* .")
client.close()
This is the bash script (Collect-Usage-SVN.sh):
#!/usr/bin/env bash
echo "Bash script is running"
CurrentDate=$(date +"%F")
RawDataFolder="/data/audit"
svnLOGFILE="/data/audit/log-svn-usage-data-collection.log"
#echo "Timestamp when starting the work $(date +"%D %T")" >> $svnLOGFILE
echo "Timestamp when starting the work $(date +"%F %T")" >> $svnLOGFILE
# Collect raw data
echo "Generating raw SVN usage data" >> $svnLOGFILE
cp -v /data/conf/mod_authrewrite.map $RawDataFolder/$CurrentDate-svnRawData-mod_authrewrite.map.txt >> $svnLOGFILE;
cp -v /data/conf/svn_authorization.conf $RawDataFolder/$CurrentDate-svnRawData-authorization.conf.txt >> $svnLOGFILE;
cut -d: -f1 /data/conf/localauthfile.htpasswd > $RawDataFolder/$CurrentDate-svnRawData-localauthfile.htpasswd.txt
cd /data/svn; ls -ltr /data/svn | du -h --max-depth=1 > $RawDataFolder/$CurrentDate-svnRawData-repositoriesSize.csv;
for repo in /data/svn/*; do echo $repo; svnlook date $repo; done > $RawDataFolder/$CurrentDate-svnRawData-repositoriesLastChangeDate.csv;
echo "Finished generating raw data" >> $svnLOGFILE
echo "Timestamp when work is finished $(date +"%D %T")" >> $svnLOGFILE
echo "Happy data analysis !" >> $svnLOGFILE
echo "***********************************************************************************" >> $svnLOGFILE
echo "" >> $svnLOGFILE
On the first line I also tried: #!/usr/bin/env
This is how I run the script on the server, directly in the bash shell:
/data/audit/Collect-Usage-SVN.sh > /dev/null 2>&1
The command below is used in a CMD terminal to copy the files from the server to the local disk (copy SVN and GIT raw files from the SVN server to local Windows):
scp root@user:/data/audit/2022-07-08-* .
(see example in Copy_from_SVN_server.jpeg)
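If the permission problem comes from launching scp through cmd, one alternative worth trying is Paramiko's own SFTP client, which reuses the already-authenticated connection so the download step stays in Python. A minimal sketch, assuming the same `client` as above (the `filter_by_prefix` and `download_audit_files` helpers are illustrative names, not part of the original script):

```python
import os

def filter_by_prefix(names, prefix):
    """Return only the names that start with the given date prefix."""
    return [n for n in names if n.startswith(prefix)]

def download_audit_files(client, remote_dir, prefix, local_dir):
    """Fetch matching files over SFTP using an existing paramiko SSHClient."""
    sftp = client.open_sftp()
    try:
        for name in filter_by_prefix(sftp.listdir(remote_dir), prefix):
            sftp.get(remote_dir + "/" + name, os.path.join(local_dir, name))
    finally:
        sftp.close()
```

Usage would look like `download_audit_files(client, "/data/audit", "2022-07-08-", r'D:\GIT-files\Automate-Stats\SVN_sample_files\svnrawdatas')`, with no cmd window or separate scp credentials involved.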

Related

Cannot execute shell command with redirected multiline input using Python Paramiko

I am using Paramiko to do the standard SSH into a box, run commands, and display the STDOUT to my terminal. Due to sudo rules, I SSH into a machine with my username and run sudo /bin/su -s /bin/bash - <diff user account>. In my script, I am passing the following command into Paramiko but the STDOUT does not show on my screen. I believe this is because the sudo command is opening a new shell and Paramiko is watching the STDOUT on the new shell.
The commands DO run as I have logged onto the box and see the command history. How do I get the STDOUT of the commands I am running to show on my terminal?
import paramiko

def sshCommand(hostname, port, username, command, key_filename='/home/<my username>/.ssh/id_rsa'):
    sshClient = paramiko.SSHClient()
    sshClient.set_missing_host_key_policy(paramiko.AutoAddPolicy())
    sshClient.connect(hostname=hostname, port=port, username=username, key_filename=key_filename)
    stdin, stdout, stderr = sshClient.exec_command(command, get_pty=False)
    output = stdout.readlines()
    print(output)

sshCommand('<servername>', 22, '<my username>', """sudo /bin/su -s /bin/bash - <diff username> << 'EOF'
echo "Hello World"
EOF
""")
I do not think that OpenSSH SSH server can accept the << 'EOF' shell construct on the "exec" channel.
But this should work:
echo echo "Hello World" | sudo /bin/su -s /bin/bash - <diff username>

Secure Copy (scp) the latest file which arrives at a given folder?

I need to write a script in bash/python to scp the latest file which arrives at a given folder. That is, I am continuously getting files into a folder, say /home/ram/Khopo/, and I need to scp each one to xxx@192.168.21.xxx into /home/xxx/khopo/.
I googled and got this result
file_to_copy=`ssh username@hostname 'ls -1r | head -1'`
echo copying $file_to_copy ...
scp username@hostname:$file_to_copy /local/path
But I want to know whether it is possible to do this so that it runs only when a new file arrives at the source (/home/ram/Khopo/), i.e. waits for a file to reach the folder and copies it immediately when it arrives.
I would try to sync the remote directory. These should give you a good overview of how to do that:
rsync:
https://askubuntu.com/a/105860
https://www.atlantic.net/hipaa-compliant-cloud-storage/how-to-use-rsync-copy-sync-files-servers/
or other tools for syncing:
https://en.wikipedia.org/wiki/Comparison_of_file_synchronization_software
As others have suggested you can use inotifywait, below an example of what you could do in bash:
#!/bin/bash
echo "Enter ssh password"
IFS= read -rs password # Read the password in a hidden way
inotifywait -m -e create "/folder_where_files_arrive" | while read line
do
    file_to_copy=$(echo $line | cut -d" " -f1,3 --output-delimiter="")
    echo copying $file_to_copy ...
    if [[ -d $file_to_copy ]]; then # it is a directory
        sshpass -p $password scp -r username@hostname:$file_to_copy /local/path
    elif [[ -f $file_to_copy ]]; then # it is a file
        sshpass -p $password scp username@hostname:$file_to_copy /local/path
    fi
done
Then you would ideally put this script to run in the background, e.g.:
nohup script.sh &
For sshpass, you can install it on Ubuntu/Debian with:
apt install sshpass
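If a Python variant is preferred over inotifywait, a simple polling approach can find the newest entry by modification time. This is only a sketch (polling is less immediate than inotify, and the helper below is illustrative, not part of the answer above); the returned path would then be handed to scp or SFTP:

```python
import os

def newest_file(folder):
    """Return the path of the most recently modified regular file, or None."""
    candidates = [
        os.path.join(folder, name)
        for name in os.listdir(folder)
        if os.path.isfile(os.path.join(folder, name))
    ]
    return max(candidates, key=os.path.getmtime) if candidates else None
```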

Unable to get the proper exit value of a python script, being run on a remote machine

I am new to programming. Please be kind. :)
I am trying to invoke a python script on a remote machine from a shell script on my local machine.
I know that we can exit a python script with values ranging from 0 to 255. I am trying to exit the python script with a value of 3. I verified with a print, and I see the exit value is correct on the remote machine.
But on my local machine, I always see the exit value of remote script as 0.
This is my shell script on the local machine.
sshpass -p password ssh -o StrictHostKeyChecking=no root@10.10.10.10 << EOF
cd /root/drm/myDir
./bla.py
echo $?
EOF
This is my python script on a remote machine:
import os
import sys

for curr_tc in range(1, 10):
    cmd = '..........'
    os.system(cmd)
    :
    :
    :
    :
    if 'PASSED' in lineList[-5]:
        continue
    else:
        exit(curr_tc)
exit(0)
Please point out my mistake. Thanks in advance.
The reason why this fails is that $? is expanded on the client side.
The best way to fix this is to not inspect the value on the server side at all, and instead let it get propagated to the client side:
sshpass -p password ssh -o StrictHostKeyChecking=no root@10.10.10.10 << EOF
cd /root/drm/myDir
./bla.py
EOF
echo "SSH relayed the exit code, look: $?"
This allows it to work with all forms of if statements, set -e, or other ways of inspecting exit codes on the client.
The alternative way is to make sure the $? is escaped by quoting the here document:
sshpass -p password ssh -o StrictHostKeyChecking=no root@10.10.10.10 << "EOF"
cd /root/drm/myDir
./bla.py
echo "The command exited with $?"
EOF
echo "SSH relayed the exit code of echo itself, check it out: $?"
This will print the exit code correctly, but ssh itself will always count it as a success because the last command, echo, successfully printed stuff.
You could rewrite without using EOF like so:
result=$(sshpass -p password ssh -o StrictHostKeyChecking=no root@10.10.10.10 'cd /root/drm/myDir;./bla.py;echo $?')
echo $result
This will run all the commands on the host, including the expansion of $?. When using the EOF heredoc, the expansion happens on your local machine, as the other answer notes. Tested on my machines with a simple bash script that just ran exit 3, and it worked.
You could then compare the result using a simple if statement like this:
if [ $result -eq 3 ]; then
echo "The process completed successfully."
else
echo "Process returned unexpected return code $result."
# Do whatever you need to if this happens.
fi
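The propagation itself can also be checked locally without any SSH involved: subprocess reports a child's exit status the same way ssh relays the remote one. A quick sketch using a child that exits with status 3, like bla.py does via exit(curr_tc):

```python
import subprocess
import sys

# Spawn a child process that exits with status 3; the parent reads the
# code from returncode, just as ssh relays the remote command's status.
result = subprocess.run([sys.executable, "-c", "raise SystemExit(3)"])
print(result.returncode)  # 3
```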

Executing remote python script in background over SSH

I have a python file "run.py" like below on my remote server.
import subprocess
subprocess.Popen(["nohup", "python", "/home/admin/Packet/application.py", "&"])
I want to run that file from my local computer using SSH. I'm trying the following; however, it seems the script isn't being run in the background.
ssh -n -f -i /Users/aws/aws.pem admin#hello_world.com 'python /home/admin/run.py'
After running that command, my terminal got stuck.
The following is an example I'm using; you can try something like this, customizing the ssh_options.
import subprocess
ssh_options = '-o ConnectTimeout=10 -o PasswordAuthentication=no -o PreferredAuthentications=publickey -o StrictHostKeyChecking=no'
server_name = 'remote_server.domain'
cmd = 'ssh ' + ssh_options + ' ' + server_name + ' "/usr/bin/nohup /usr/bin/python /home/admin/run.py 2>&1 &"'
p = subprocess.Popen(cmd, shell=True, stdout=subprocess.PIPE, stderr=subprocess.PIPE)
Later you can redirect the output to a flat file by changing:
2>&1 &
for:
>> /path/to/log_file.txt 2>&1 &
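Building that remote command string in Python keeps the quoting in one place. A small helper along these lines (the function name is hypothetical; it applies the same nohup-plus-redirection idea as above):

```python
def remote_background_cmd(script_path, log_file=None):
    """Wrap a remote script in nohup, backgrounded, optionally logging to a file."""
    redirect = ">> %s 2>&1" % log_file if log_file else "2>&1"
    return '/usr/bin/nohup /usr/bin/python %s %s &' % (script_path, redirect)
```

The result would then be embedded in the ssh command, e.g. `cmd = 'ssh ' + ssh_options + ' ' + server_name + ' "%s"' % remote_background_cmd('/home/admin/run.py')`.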

Unable to truly background an SSH tunnel using Fabric and nohup

I can't seem to get Fabric to play nice with backgrounding a process that I've used nohup on... It should be possible, given various pieces of information, including here and here.
def test():
    h = 'xxxxx.compute-1.amazonaws.com'
    ports = [16646, 9090, 6666]
    with settings(host_string=h):
        tun_s = "ssh -o StrictHostKeyChecking=no -i ~/.ssh/kp.pem %s@%s " % (env.user, h)
        for port in ports:
            p_forward = "-L %d:localhost:%d" % (port, port)
            tun_s = "%s %s" % (tun_s, p_forward)
        tun_s = "%s -N" % tun_s
        # create the tunnel...
        print "creating tunnel %s" % tun_s
        run("nohup '%s' >& /dev/null < /dev/null &" % tun_s)
        print "fin"
Abbreviated output:
ubuntu@domU-xxx:~/deploy$ fab test
executing on tunnel ssh -o StrictHostKeyChecking=no -i ~/.ssh/kp.pem ubuntu@xxx -L 16646:localhost:16646 -L 9090:localhost:9090 -L 6666:localhost:6666 -N
[xxx.compute-1.amazonaws.com] run: nohup 'ssh -o StrictHostKeyChecking=no -i ~/.ssh/kp.pem ubuntu@xxx.compute-1.amazonaws.com -L 16646:localhost:16646 -L 9090:localhost:9090 -L 6666:localhost:6666 -N' >& /dev/null < /dev/null &
fin
Done.
Disconnecting from xxxx
I know there is no problem with the tunnel command per se, because if I strip away the nohup stuff it works fine (but obviously Fabric hangs). I'm pretty sure it isn't properly getting detached, and when the run function returns, the tunnel process immediately dies.
But why?
This also happens with a python command in another part of my code.
So, it seems after much wrangling that this is not possible, for whatever reason, with my setup (default Ubuntu installs on EC2 instances). I have no idea why, as it seems possible according to various sources.
I fixed my particular problem by using Paramiko in place of Fabric, for calls that need to be left running in the background. The following achieves this:
import paramiko
ssh = paramiko.SSHClient()
ssh.set_missing_host_key_policy(paramiko.AutoAddPolicy())
privkey = paramiko.RSAKey.from_private_key_file('xxx.pem')
ssh.connect('xxx.compute-1.amazonaws.com', username='ubuntu', pkey=privkey)
stdin, stdout, stderr = ssh.exec_command("nohup ssh -f -o StrictHostKeyChecking=no -i ~/.ssh/xxx.pem ubuntu@xxx.compute-1.amazonaws.com -L 16646:localhost:16646 -L -N >& /dev/null < /dev/null &")
ssh.close()
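As an aside, the -L forwarding flags that the Fabric loop assembled can be produced by a one-liner, which keeps the long tunnel command easier to audit (hypothetical helper, same flag format as the loop in the question):

```python
def forward_args(ports):
    """Build the ssh -L forwarding flags for a list of local=remote ports."""
    return " ".join("-L %d:localhost:%d" % (p, p) for p in ports)
```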
