Python subprocess rsync using sshpass and specify port

I've searched around quite a bit, finding pieces of what I wish to achieve but not the whole picture. I'm writing a sync script to synchronize files between two machines. The script itself is somewhat more advanced than this question suggests (it lets both sides request file deletion and so on; there is no "master" side).
First question
The following bash-command works for me:
rsync -rlvptghe 'sshpass -p <password> ssh -p <port>' <source> <destination>
How can I translate it into a Python command to be used with the subprocess module?
I've managed to get the following python to work:
pw = getpass.getpass("Password for remote host: ")
command = ['sshpass', '-p', pw, 'rsync', '-rlvptgh', source, destination]
p = subprocess.Popen(command, stdout=subprocess.PIPE, stderr=subprocess.PIPE)
while p.poll() is None:
    out = p.stdout.read(1)
    sys.stdout.write(out)
    sys.stdout.flush()
but it doesn't specify the port (it uses the standard 22; I want another one). To clarify, I wish to use code similar to this, but with support for a specific port as well.
I have already tried to change the command to:
command = ['sshpass', '-p', pw, 'rsync', '-rlvptghe', 'ssh', '-p', '2222', source, destination]
which gives the following error:
ssh: illegal option -- r
and also many other variations such as for instance:
command = ['rsync', '-rlvptghe', 'sshpass', '-p', pw, 'ssh', '-p', '2222', source, destination]
Which gives the following error (where <source> is the remote source host to sync from, i.e. the source variable declared above the command):
Unexpected remote arg: <source>
How should I specify this command to nest them according to my first bash command?
Second question
In all my searching I've found lots of frowning upon using a command containing the password for scp/rsync (i.e. ssh), which I use in my script. My reasoning is that I want to be prompted for a password when I do the synchronization. It is done manually, since it gives feedback on filesystem modifications and other things. However, since I do 2 scp and 2 rsync calls, I don't want to type the same password 4 times. That is why I use this approach and let Python (the getpass module) collect the password once and then use it for all four logins.
If the script was planned for an automated setup I would of course use certificates instead, I would not save the password in clear text in a file.
Am I still reasoning the wrong way about this? Are there things I could do to strengthen the integrity of the password used? I've already realized that I should suppress errors coming from the subprocess module since it might display the command with the password.
Any light on the problem is highly appreciated!
EDIT:
I have updated question 1 with some more information as to what I'm after. I also corrected a minor copy + paste error in the python code.
Edit 2: I have already tried the exact same order as the first bash command; that was the first thing I tried, and it doesn't work. The reason for changing the order was that another order (sshpass first) worked when no port was specified.

I have found one way to solve this for my own needs. It involves invoking a shell to handle the command, which I had tried to avoid in the first place. It works for me, though it might not satisfy others; it depends on the environment you want to run the command in. For me this is more or less an extension of the bash shell: I want to do some things that are easier in Python and at the same time run some bash commands (scp and rsync).
I'll wait for a while and if there's no better solution than this I will mark my answer as the answer.
A basic function for running rsync via python with password and port could be:
def syncFiles(pw, source, destination, port, excludeFile=None, dryRun=False, showProgress=False):
    command = 'rsync -rlvptghe \'sshpass -p ' + pw + ' ssh -p ' + str(port) + '\' ' + source + ' ' + destination
    if excludeFile is not None:
        command += ' --exclude-from=' + excludeFile
    if dryRun:
        command += ' --dry-run'
    if showProgress:
        command += ' --progress'
    p = subprocess.Popen(command, shell=True, stdout=subprocess.PIPE)
    while p.poll() is None:
        out = p.stdout.read(1)
        sys.stdout.write(out)
        sys.stdout.flush()
This works because, as noted, the invoked shell parses the command string instead. That way I can write the command exactly as I would type it directly in a shell. I still don't know how to do this without shell=True.
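For what it's worth, here is a sketch of how the same command could be expressed as a list without shell=True (untested against a live host; build_rsync_command is my own illustrative helper). The key point is that the whole remote-shell string is the single value of rsync's -e option, so it must stay one list element:

```python
def build_rsync_command(pw, source, destination, port):
    # The whole `sshpass ... ssh -p <port>` string is ONE argument to
    # -e, so it is ONE element of the list; no inner shell quoting is
    # needed because no shell ever parses it.
    return [
        'rsync', '-rlvptgh',
        '-e', 'sshpass -p {0} ssh -p {1}'.format(pw, port),
        source, destination,
    ]

# Usage would then be, e.g.:
# p = subprocess.Popen(build_rsync_command(pw, source, destination, 2222),
#                      stdout=subprocess.PIPE, stderr=subprocess.PIPE)
```

Note that the standalone -e replaces the trailing e in -rlvptghe, mirroring the original bash command.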
Note that the password is collected from the user with the getpass module:
pw = getpass.getpass("Password for current user on remote host: ")
It is not recommended to store your password in the python file or any other file. If you are looking for an automated solution it is better to use private keys. Answers for such solutions can be found by searching.
To call the scp-command with password the following python should do:
subprocess.check_output(['sshpass', '-p', pw, 'scp', '-P', port, source, destination])
I hope this can be useful to someone who wants to achieve what I am doing.

Related

Become root user and execute command after performing ssh [duplicate]

I use a friend's server that allows only one user to be logged in via SSH, so normally I just log in as that user and then do su -l myuser to change accounts. I wanted to automate some boring stuff using Python, but I ran into problems with that. Apparently the Paramiko module, which I tried first, invokes a separate shell for every command, so that was out of the question. Later I tried using invoke_shell() to overcome that, but it still failed (I assume because changing user changes the shell as well).
After that I found out about the Fabric module, but the best I could do was open an SSH shell with the proper user logged in, without any option to run commands from code.
Is there any way to accomplish that? Final goal would probably look something like this:
ssh.login(temp_user, pass)
ssh.command("su -l myuser")
expect("Password: ", ssh.send("mypass\n"))
ssh.command("somescript.sh > datadump.txt")
Using sudo is impossible, as well as adding passwordless login.
As suggested, here is the code that I tried with Paramiko:
import paramiko
host = "hostip"
user = "user"
user_to_log = "myuser"
password = "pass"
password_to_log = "mypass"
login_command = "su -l " + user_to_log
ssh = paramiko.SSHClient()
ssh.load_system_host_keys()
ssh.set_missing_host_key_policy(paramiko.AutoAddPolicy())
ssh.connect(host, username=user, password=password)
transport = ssh.get_transport()
session = transport.open_session()
session.set_combine_stderr(True)
session.get_pty()
session.exec_command("su -l " + user_to_log)
stdin = session.makefile('wb', -1)
stdin.write(password_to_log +'\n')
stdin.flush()
session.exec_command("whoami")
stdout = session.makefile('rb', -1)
for line in stdout.read().splitlines():
    print('host: %s: %s' % (host, line))
su -c command won't work either, since the server's system doesn't support that option.
General disclaimers first (to others who stumble upon this question):
Using su is not the right solution. su is a tool intended for interactive use, not for automation. The correct solution is to log in with the correct account directly.
Or at least use a password-less sudo.
Or you can create a root-owned script with setuid right.
See also Allowing automatic command execution as root on Linux using SSH.
If you are stuck with su, on most systems you can use -c switch to su to specify a command:
su -c "whoami" user
See also How to run sudo with Paramiko? (Python)
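As a sketch of how one might build that su -c invocation safely from Python before passing it to exec_command (the su_command helper is my addition, not from the answer; shlex.quote is pipes.quote on Python 2):

```python
import shlex

def su_command(user, command):
    # Quote both the inner command and the user name so that spaces or
    # shell metacharacters survive the remote shell's parsing.
    return 'su -c {0} {1}'.format(shlex.quote(command), shlex.quote(user))

# e.g. client.exec_command(su_command(user_to_log, 'whoami'))
# would run `whoami` as user_to_log (su still prompts for a password).
```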
If none of the above is feasible (and you really tried hard to make the admin enable some of the options above):
As a last resort, you can write the command to the standard input of su, the same way you already write the password (another thing not to do):
stdin, stdout, stderr = ssh.exec_command("su -l " + user_to_log)
stdin.write(password_to_log + '\n')
stdin.flush()
command = 'whoami'
stdin.write(command + '\n')
stdin.flush()
(also note that it's redundant to call makefile, as exec_command already returns that)
See Execute (sub)commands in secondary shell/command on SSH server in Python Paramiko.
Note that your question is not about which SSH client library to use. It does not matter if you use Paramiko or other. This all is actually a generic SSH/Linux/shell question.


How to pass commands to an SSH from subprocess in Python

I have used the subprocess module in Python 2.7.6 to establish an SSH connection. I realise that this is not recommended, but I am unable to install other Python SSH libraries such as paramiko and fabric.
I was just wondering if someone could tell me how I'd now go about it. I currently have:
sshProcess = subprocess.call(['ssh', '-t', '<REMOTE>', 'ssh', '<REMOTE>'])
I want to carry out commands in REMOTE with the subprocess approach. Is there any way to do this? Unfortunately, REMOTE is protected by a password which the user manually enters. If it helps, I'm running the Windows 10 Bash shell.
Any help is appreciated.
Running a remote command is as simple as putting it on the command line. (At the protocol level the SSH server can distinguish this from feeding the command on stdin, but the command-line form is built for programmatic use, whereas the interactive-shell model was designed for human use.)
By the way, if you want to run multiple commands via distinct SSH invocations over a single connection after authenticating only once, I'd strongly suggest using Paramiko for this, but you can do it with OpenSSH command-line tools by using SSH multiplexing support.
Let's say you have an array representing your remote command:
myCommand = [ 'ls', '-l', '/tmp/my directory name with spaces' ]
To get that into a string (in a way that honors the spaces and can't let a maliciously-selected name run arbitrary commands on the remote server), you'd use pipes.quote (available as shlex.quote on Python 3):
myCommandStr = ' '.join(pipes.quote(n) for n in myCommand)
Now, you have something you can pass as a command line argument to ssh:
subprocess.call(['ssh', '-t', hostname, myCommandStr])
However, let's say you want to nest this. You can just repeat the process:
myCommand = [ 'ssh', '-t', hostname1, myCommandStr ]
myCommandStr = ' '.join(pipes.quote(n) for n in myCommand)
subprocess.call(['ssh', '-t', hostname2, myCommandStr])
Because we aren't redirecting stdin or stdout, they should still be pointed at the terminal from which your Python program was started, so SSH should be able to execute its password prompts directly.
That said, specifically for ssh'ing through an intermediate host, you don't need to go to this much trouble: you can tell ssh to do that work for you with the ProxyJump option:
myCommand = [ 'ls', '-l', '/tmp/my directory name with spaces' ]
myCommandStr = ' '.join(pipes.quote(n) for n in myCommand)
subprocess.call(['ssh', '-o', 'ProxyJump=%s' % hostname1, hostname2, myCommandStr])
From your comment, you say you can connect. So after that, to interact over ssh using subprocess you will need something like:
ssh = subprocess.Popen(['ssh', <remote client>],
                       stdin=subprocess.PIPE,
                       stdout=subprocess.PIPE,
                       stderr=subprocess.PIPE)
back = ssh.stdout.readlines()
if back == []:
    error = ssh.stderr.readlines()
    print error
else:
    print back
then to send commands, like list directory, something like:
ssh.stdin.write("ls\n")
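A caveat on the pattern above: readlines() blocks until EOF, so reading before writing can hang forever. One deadlock-free pattern is to hand everything to communicate() at once. A self-contained sketch, with a local sh standing in for the ssh process (my substitution, so it runs without a remote host; the pipe handling is identical):

```python
import subprocess

# `sh` stands in here for `ssh <remote client>`.
proc = subprocess.Popen(['sh'],
                        stdin=subprocess.PIPE,
                        stdout=subprocess.PIPE,
                        stderr=subprocess.PIPE,
                        universal_newlines=True)
# communicate() writes all the commands, closes stdin, and drains both
# pipes to completion, so neither side can deadlock on a full buffer.
out, err = proc.communicate('echo hello\necho done\n')
```

The trade-off is that communicate() is one-shot: you get all the output after the session ends, not interactively.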

Automating switching from normal user to root with subprocess in Python

Question:
I am trying to execute a command which reads from a PostgreSQL db. I am able to manually switch to root, then switch to the postgres user and access the information I want.
The problem I have is that when I run this, it just hangs and nothing happens.
I have the root password and will need it when switching from the current user, but I am not being prompted to enter it.
How can I get this not to hang, and be prompted for the password?
The code below only executes 'ls' for simplicity.
Code:
def popen_cmd_shell(command):
    print command
    process = subprocess.Popen(command,
                               stdout=subprocess.PIPE,
                               stderr=subprocess.STDOUT,
                               shell=True)
    proc_stdout = process.communicate()[0].strip()
    return proc_stdout

if __name__ == '__main__':
    querylist = popen_cmd_shell('su - root; ls;')
    print querylist
Update One:
I am unable to use any library that does not come with Python 2.7 on SUSE Linux. I just need to execute a command and exit.
Update Two:
I am unable to run the script as root as I need to perform other tasks which require me not to be root.
Update Three:
As per LeBarton's suggestions I have got the script to log into root, although the ls command never gets executed as root; it gets executed as the user I originally was. When I run the command I get prompted to enter the root password and get transferred from "#host" to "host", which cannot execute any command other than exit. When I exit, the output of all the executed commands appears.
I do not wish to store the user password in the code as LeBarton does. How can I execute a command as root, return, and continue with the rest of the script, without getting locked into the new user's shell and needing to type 'exit'?
The "stderr=subprocess.STDOUT" seems to have been what was causing it to hang.
Code:
if __name__ == '__main__':
    def subprocess_cmd(command):
        process = subprocess.Popen(command, stdout=subprocess.PIPE, shell=True)
        proc_stdout = process.communicate()[0].strip()
        print proc_stdout

    subprocess_cmd('echo a; su - root; ls; cd; ls;')
...continue with rest of script where I execute commands as original user
Answer:
Thanks to tripleee for his excellent answer.
I have achieved what I set out to do with the following code:
if __name__ == '__main__':
    def subprocess_cmd(command):
        process = subprocess.Popen(command, stdout=subprocess.PIPE, shell=False)
        proc_stdout = process.communicate()[0].strip()
        print proc_stdout

    subprocess_cmd(['su', '-', 'root', '-c', 'su -s /bin/sh postgres -c \'psql -U msa ..........])
I just needed to the place the command I was executing as root after -c. So it now switches to the postgres user and finds the data it needs from root returning to the normal user after.
You are misunderstanding what su does. su creates a privileged subprocess; it does not change the privileges of your current process. The commands after su will be executed after su finishes, with your normal privileges.
Instead, you want to pass a -c option to specify the commands you want to run with elevated privileges. (See also the su man page.)
popen_cmd_shell('su -c ls - root')
sudo was specifically designed to simplify this sort of thing, so you should probably be using that instead.
Scripted access to privileged commands is a sticky topic. One common approach is to have your command perform the privileged operation, then drop its privileges. Both from a security and a design point of view, this approach tends to simplify the overall logic. You need to make sure your privileged code is as simple and short as possible--no parsing in the privileged section, for example.
Once the privileged code is properly tested, audited, and frozen, bestowing the script with the required privileges should be a simple matter (although many organizations are paranoid, and basically unable to establish a workable policy for this sort of thing).
Regardless of which approach you take, you should definitely avoid anything with shell=True in any security-sensitive context, and instead pass any external commands as a list, not as a single string.
popen_cmd_shell(['su', '-c', 'ls', '-', 'root'])
(Maybe also rename the function, since you specifically do not want a shell. Obviously, also change the function to specify shell=False.)
Again, these security concerns hold whether you go with dropping privileges, or requesting privilege escalation via su or sudo.
Your command is this
su - root; ls;
The shell is interpreting it as this
"su -root; ls;"
You probably don't have an executable in your PATH with that exact name, spaces included.
Try separating it into a list with
['su', '-', 'root', ';', 'ls', ';' ]
EDIT
stdout=subprocess.PIPE is causing the program to hang. If you are trying to pass the password in, process.communicate('root password') works.
Do you just want to access the PostgreSQL database? If so, you don't need to use command line at all...
The Python library psycopg2 will allow you to send commands to the PostgreSQL server; more on that here: https://wiki.postgresql.org/wiki/Psycopg2_Tutorial
However I recommend an ORM such as SQLAlchemy, they make communicating with a database a trivial task.
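A minimal sketch of the psycopg2 route (the fetch_rows helper and the DSN string are my illustrative placeholders, not from the answer; psycopg2 is a third-party package):

```python
try:
    import psycopg2  # third-party: pip install psycopg2-binary
except ImportError:
    psycopg2 = None  # not installed; this remains a sketch

def fetch_rows(dsn, query):
    # Run a query directly against PostgreSQL and return all rows,
    # avoiding the su/psql shell dance entirely.
    if psycopg2 is None:
        raise RuntimeError('psycopg2 is not installed')
    conn = psycopg2.connect(dsn)  # e.g. 'dbname=mydb user=postgres'
    try:
        with conn.cursor() as cur:
            cur.execute(query)
            return cur.fetchall()
    finally:
        conn.close()
```

The connection authenticates as the postgres user directly (via pg_hba.conf rules), so no root switch is needed.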

Persistent ssh session in Python using Popen

I am creating a movie controller (Pause/Stop...) using python where I ssh into a remote computer, and issue commands into a named pipe like so
echo -n q > ~/pipes/pipename
I know this works if I ssh via the terminal and do it myself, so there is no problem with the setup of the named pipe redirection. My problem is that setting up an ssh session takes time (1-3 seconds), whereas I want the pause command to be instantaneous. Therefore, I thought of setting up a persistent pipe like so:
controller = subprocess.Popen ( "ssh -T -x <hostname>", shell = True, close_fds = True, stdin=subprocess.PIPE, stderr=subprocess.PIPE, stdout=subprocess.PIPE )
Then issue commands to it like so
controller.stdin.write ( 'echo -n q > ~/pipes/pipename' )
I think the problem is that ssh is interactive so it expects a carriage return. This is where my problems begin, as nearly everyone who has asked this question has been told to use an existing module:
Vivek's answer
Chakib's Answer
shx2's Answer
Crafty Thumber's Answer
Artyom's Answer
Jon W's Answer
Which is fine, but I am so close. I just need to know how to include the carriage return, otherwise, I have to go learn all these other modules, which mind you is not trivial (for example, right now I can't figure out how pexpect uses either my /etc/hosts file or my ssh keyless authentications).
To add a newline to the command, you will need to add a newline to the string:
controller.stdin.write('\n')
You may also need to flush the pipe:
controller.stdin.flush()
And of course the controller has to be ready to receive new data, or you could block forever trying to send it data. (And if the reason it's not ready is that it's blocking forever waiting for you to read from its stdout, which is possible on some platforms, you're deadlocked unrecoverably.)
I'm not sure why it's not working the way you have it set up, but I'll take a stab at this. I think what I would do is change the Popen call to:
controller = subprocess.Popen("ssh -T -x <hostname> \"sh -c 'cat > ~/pipes/pipename'\"", ...
And then simply controller.stdin.write('q').
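To see the write/flush mechanics without a remote host, a local cat can stand in for the ssh ... "cat > ~/pipes/pipename" process (my substitution, to keep the sketch self-contained):

```python
import subprocess

# cat echoes stdin to stdout, standing in for the remote
# `cat > ~/pipes/pipename` end of the persistent session.
controller = subprocess.Popen(['cat'],
                              stdin=subprocess.PIPE,
                              stdout=subprocess.PIPE,
                              universal_newlines=True)
controller.stdin.write('q')
controller.stdin.flush()   # push the single command byte through now
controller.stdin.close()   # EOF ends the stand-in session
forwarded = controller.stdout.read()
```

With a real ssh process the stdin.write/flush pair is the part that matters: each flushed write reaches the remote cat (and thus the named pipe) immediately, without reopening the connection.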
