I have a Python file "run.py" like the one below on my remote server.
import subprocess
subprocess.Popen(["nohup", "python", "/home/admin/Packet/application.py", "&"])
I want to run that file from my local computer over SSH. I'm trying the command below, but it seems the script isn't being run in the background.
ssh -n -f -i /Users/aws/aws.pem admin@hello_world.com 'python /home/admin/run.py'
After running that command, my terminal got stuck.
The following is an example I'm using; you can try something like this, customizing ssh_options as needed.
import subprocess
ssh_options = '-o ConnectTimeout=10 -o PasswordAuthentication=no -o PreferredAuthentications=publickey -o StrictHostKeyChecking=no'
server_name = 'remote_server.domain'
cmd = 'ssh ' + ssh_options + ' ' + server_name + ' "/usr/bin/nohup /usr/bin/python /home/admin/run.py 2>&1 &"'
p = subprocess.Popen(cmd, shell=True, stdout=subprocess.PIPE, stderr=subprocess.PIPE)
Later you can redirect the output to a flat file by replacing:
2>&1 &
with:
>> /path/to/log_file.txt 2>&1 &
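For example, a minimal sketch of the same command with the output redirected to a log file (the log path is a placeholder):
cmd = ('ssh ' + ssh_options + ' ' + server_name +
       ' "/usr/bin/nohup /usr/bin/python /home/admin/run.py'
       ' >> /path/to/log_file.txt 2>&1 &"')
p = subprocess.Popen(cmd, shell=True, stdout=subprocess.PIPE, stderr=subprocess.PIPE)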
I've come here for some help: I have a Python script that collects the IP addresses of the machines on my network (192.168.1.X to 192.168.1.Y).
For reference, I'm working on Ubuntu 16.04 with Python 2.7.
This is how my Python script runs the command (the collected IPs are joined into the string ip_list):
import subprocess
ip_list = "192.168.1.1,192.168.1.2,192.168.1.3,192.168.1.4,192.168.1.5"
cmd2 = ("fab -f /home/user/fabfile.py -H " + ip_list + " -u user -p password auto_install")
proc2 = subprocess.Popen(cmd2, shell=True, stdout=subprocess.PIPE)
My Fabric task just copies a bash script onto the machines and executes it; here is a sample:
put("/home/user/bash_script","/home/user/bash_script",False)
sudo('bash /home/user/bash_script')
But this error appears in the terminal when I run sudo python mypythonscript.py:
close failed in file object destructor:
sys.excepthook is missing
lost sys.stderr
/bin/sh: 2: -u: not found
OK, I got it, you saved me! @Bodo
I hadn't printed cmd2, but it turns out there is a \n between ip_list and -u.
So I did cmd2 = cmd2.replace("\n", "") and now I have another problem that I will look into.
Thank you :)
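For reference, a minimal sketch of stripping stray newlines and whitespace from the collected IPs before building the command (variable names follow the question above; this assumes ip_list can pick up a trailing newline from wherever it is collected):
import subprocess
ip_list = ip_list.replace("\n", "").strip()
cmd2 = "fab -f /home/user/fabfile.py -H " + ip_list + " -u user -p password auto_install"
proc2 = subprocess.Popen(cmd2, shell=True, stdout=subprocess.PIPE)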
I need to execute the following command from Python on Windows:
psql -h localhost -p 5432 -U postgres -f script.sql db_name
The above command works fine when run from Git Bash / PowerShell. After entering it in a terminal, I need to provide a password to confirm it (similar to when using sudo).
How can I do that? I keep finding solutions that I think are Linux-based.
How do I do it on Windows? I have tried many variations of solutions involving subprocess, e.g.:
import subprocess
p2 = subprocess.Popen(
    'psql -h localhost -p 5432 -U postgres -f script.sql db_name',
    stdin=subprocess.PIPE,
    stderr=subprocess.PIPE,
    universal_newlines=True)
print('this will print')
sudo_prompt = p2.communicate('THE_PASSWORD' + '\n')[1]
print('this will not')
A better option (and more secure) than invoking psql with an explicit password is to use a .pgpass file as described in the docs (and keep it protected, e.g. chmod 600 ~/.pgpass). This keeps your password out of the list of running processes.
On Windows:
On Microsoft Windows the file is named %APPDATA%\postgresql\pgpass.conf (where %APPDATA% refers to the Application Data subdirectory in the user's profile).
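As a sketch, once pgpass.conf is in place the psql call from the question can run without feeding a password on stdin; the -w flag tells psql to fail instead of prompting if no password is found (paths and the database name are taken from the question):
import subprocess

result = subprocess.run(
    ['psql', '-h', 'localhost', '-p', '5432', '-U', 'postgres',
     '-w', '-f', 'script.sql', 'db_name'],
    capture_output=True, text=True)
print(result.stdout)
print(result.stderr)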
I need to SSH into a machine via a bastion host, which makes the command rather long:
ssh -i <pemfile location> -A -o 'proxycommand ssh -i <pemfile location> ec2-user@<bastion ip address> -W %h:%p' hadoop@<machine ip>
Since the command is so long, I wrote a Python script that takes the IP addresses and pemfile location as inputs and does the SSH for me.
#!/usr/local/bin/python3
import argparse
import subprocess
import os
import sys
import errno
parser = argparse.ArgumentParser(description="Tool to ssh into EMR via a bastion host")
parser.add_argument('master', type=str, help='IP Address of the EMR master-node')
parser.add_argument('bastion', type=str, help='IP Address of bastion EC2 instance')
parser.add_argument('pemfile', type=str, help='Path to the pemfile')
args = parser.parse_args()
cmd_list = ["ssh", "-i", args.pemfile, "-A", "-o", "'proxycommand ssh -i {} ec2-user#{} -W %h:%p'".format(args.pemfile, args.bastion), "hadoop#{}".format(args.master)]
command = ""
for w in cmd_list:
    command = command + " " + w
print("")
print("Executing command : ", command)
print("")
subprocess.call(cmd_list)
I get the following error when I run this script:
command-line: line 0: Bad configuration option: 'proxycommand
But I am able to run the exact command via bash.
Why does the SSH command fail when run from the Python script?
You are making the (common) mistake of mixing syntactic quotes with literal quotes. At the command line, the shell removes any quotes before passing the string to the command you are running; you should simply do the same.
cmd_list = ["ssh", "-i", args.pemfile, "-A",
"-o", "proxycommand ssh -i {} ec2-user#{} -W %h:%p".format(
args.pemfile, args.bastion), "hadoop#{}".format(args.master)]
See also When to wrap quotes around a shell variable? for a discussion of how quoting works in the shell, and perhaps Actual meaning of 'shell=True' in subprocess as a starting point for the Python side.
However, scripting interactive SSH sessions is going to be brittle; I recommend you look into a proper Python library like Paramiko for this sort of thing.
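For example, a minimal Paramiko sketch of the same bastion hop, assuming the same key works for both hosts (the addresses and key path are the question's placeholders):
import paramiko

pemfile = '<pemfile location>'
bastion = '<bastion ip address>'
master = '<machine ip>'

# Build the same ProxyCommand the ssh command line used, and hand it to
# Paramiko as the socket for the connection to the master node.
proxy = paramiko.ProxyCommand(
    'ssh -i {0} -W {1}:22 ec2-user@{2}'.format(pemfile, master, bastion))

client = paramiko.SSHClient()
client.set_missing_host_key_policy(paramiko.AutoAddPolicy())
client.connect(master, username='hadoop', key_filename=pemfile, sock=proxy)

stdin, stdout, stderr = client.exec_command('hostname')
print(stdout.read().decode())
client.close()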
I'm developing an application in which I interact with docker containers.
I want to execute this command in a docker exec name_of_container command fashion:
command= "/usr/bin/balance -b "+ ip_address + " 5001 " + servers_list
The idea is to do an echo command >> /etc/supervisor/conf.d/supervisord.conf`
I tried as follows :
p = subprocess.Popen(['docker', 'exec', 'supervisor', 'echo', 'command'],
                     stdin=subprocess.PIPE,
                     stdout=subprocess.PIPE)
but it does not work.
This is the error:
exec: "'/usr/bin/balance -b 195.154.12.1 5001 192.186.13.1' >> /etc/supervisor/conf.d/supervisord.conf": stat '/usr/bin/balance -b 195.154.12.1 5001 192.186.13.1' >> /etc/supervisor/conf.d/supervisord.conf: no such file or directory
I can't seem to get Fabric to play nicely with backgrounding a process that I've run with nohup... It should be possible, given various pieces of information, including here and here.
def test():
    h = 'xxxxx.compute-1.amazonaws.com'
    ports = [16646, 9090, 6666]
    with settings(host_string = h):
        tun_s = "ssh -o StrictHostKeyChecking=no -i ~/.ssh/kp.pem %s@%s " % (env.user, h)
        for port in ports:
            p_forward = "-L %d:localhost:%d" % (port, port)
            tun_s = "%s %s" % (tun_s, p_forward)
        tun_s = "%s -N" % tun_s
        # create the tunnel...
        print "creating tunnel %s" % tun_s
        run("nohup '%s' >& /dev/null < /dev/null &" % tun_s)
        print "fin"
Abbreviated output:
ubuntu@domU-xxx:~/deploy$ fab test
executing on tunnel ssh -o StrictHostKeyChecking=no -i ~/.ssh/kp.pem ubuntu@xxx -L 16646:localhost:16646 -L 9090:localhost:9090 -L 6666:localhost:6666 -N
[xxx.compute-1.amazonaws.com] run: nohup 'ssh -o StrictHostKeyChecking=no -i ~/.ssh/kp.pem ubuntu@xxx.compute-1.amazonaws.com -L 16646:localhost:16646 -L 9090:localhost:9090 -L 6666:localhost:6666 -N' >& /dev/null < /dev/null &
fin
Done.
Disconnecting from xxxx
I know there is no problem with the tunnel command itself, because if I strip away the nohup wrapping it works fine (but obviously Fabric hangs). I'm pretty sure it isn't being properly detached, and that when the run function returns, the tunnel process immediately dies.
But why?
This also happens with a python command in another part of my code.
So, after much wrangling, it seems this is not possible for whatever reason with my setup (default Ubuntu installs on EC2 instances). I have no idea why, as it seems possible according to various sources.
I fixed my particular problem by using Paramiko in place of Fabric, for calls that need to be left running in the background. The following achieves this:
import paramiko
ssh = paramiko.SSHClient()
ssh.set_missing_host_key_policy(paramiko.AutoAddPolicy())
privkey = paramiko.RSAKey.from_private_key_file('xxx.pem')
ssh.connect('xxx.compute-1.amazonaws.com', username='ubuntu', pkey=privkey)
stdin, stdout, stderr = ssh.exec_command("nohup ssh -f -o StrictHostKeyChecking=no -i ~/.ssh/xxx.pem ubuntu@xxx.compute-1.amazonaws.com -L 16646:localhost:16646 -L -N >& /dev/null < /dev/null &")
ssh.close()