How to execute scp locally using fabric - python

I am using fabric right now to upload a large directory of files to ~100 different servers. These servers don't have rsync installed, so that's out of the question. Right now, I am using the upload_project method to accomplish this, which is working well on the test servers I have, but I have heard that ftp (even sftp) isn't allowed on some of these servers and that I might also need to limit the bandwidth of the transfer. To avoid these problems, I am trying to use scp; however, I am having some issues. I originally thought I could just go by the code in the rsync_project method and do something like local("scp -r %s %s" % (local_str, remote_str)). However, scp still wants a password. So I echoed env.password to scp, which worked, but then it needed me to type yes to accept the host key. I know I could just echo the password and echo yes to any key prompt, but I was wondering if there is some other way to accomplish this without all of the prompts. Also, sadly, the version of fabric on the server (which I cannot update) is a bit behind (1.6ish) and doesn't have the contextlib functionality that can handle prompts.

I was wondering if there was some other way to accomplish this without all of the prompts
The answer is not exactly related to Fabric. For your scp command, look into the StrictHostKeyChecking option to skip the host key check. Note that scp expects options before the source and destination operands. Example usage:
scp -o StrictHostKeyChecking=no -o UserKnownHostsFile=/dev/null $src $dest
It sounds like you've addressed the issue with the password prompt; was there anything else you were trying to solve here?
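To put the pieces together, here is a minimal sketch of building the full scp invocation from Python; the paths, host string, identity file, and bandwidth figure are all hypothetical. Key-based auth (-i) avoids the password prompt, BatchMode makes scp fail instead of prompting, and -l caps the transfer rate in Kbit/s:

```python
import subprocess

def build_scp_cmd(local_path, remote_str, identity_file):
    # scp requires options to come before the source/destination operands.
    return [
        "scp", "-r",
        "-i", identity_file,                  # key-based auth avoids the password prompt
        "-o", "StrictHostKeyChecking=no",     # skip the yes/no host-key prompt
        "-o", "UserKnownHostsFile=/dev/null",
        "-o", "BatchMode=yes",                # fail fast instead of prompting
        "-l", "8192",                         # limit bandwidth to 8192 Kbit/s
        local_path, remote_str,
    ]

cmd = build_scp_cmd("./dist", "deploy@host1:/srv/app", "~/.ssh/id_rsa")
# subprocess.check_call(cmd)  # uncomment to actually run the copy
```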

Related

How to pass an argument to the subprocess that expects input

I would like to execute a "git push" command using a Python script. I was hoping to achieve this by using python's subprocess.Popen() method. However, when I invoke the "git push" command, the ssh agent asks me for my passphrase in order to be able to push any changes to the repository.
And here is my question - is there any way for me to be able to pass my passphrase which is an input expected by the ssh process not the git itself?
process = subprocess.Popen(["git", "push"], stdin=subprocess.PIPE, stdout=subprocess.PIPE, shell=False)
pin = process.stdin.write(b"passphrase\n")
I acknowledge that it might be easier to get rid of this ssh passphrase prompt every time I git push or pull; however, just for the sake of knowing, I was curious whether there is a way to achieve it this way.
Note that GitPython issue 559 mentioned in 2016:
As far as I know, it is not possible to use keys with passphrases with GitPython due to the way git is invoked by it.
It simply does not connect standard input, which usually would be required to read the passphrase. The preferred way to do it would be to use a specific key without passphrase.
If that is not an option, git will also invoke the program pointed to by GIT_ASKPASS to read a password for use with the ssh key.
There is however a possible workaround, initially applied to GPG:
How to sign commits using the GitPython package
See 03_sign_commit_using_the_gitpython_package.py.
But again, for GPG. Check if that can be adapted for SSH.
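As a rough illustration of the GIT_ASKPASS route mentioned above: git (and, via SSH_ASKPASS, ssh itself) can be pointed at a helper program that prints the secret. Everything here is illustrative: the passphrase is fake, and note that ssh only honours SSH_ASKPASS when it has no controlling tty and DISPLAY is set.

```python
import os
import stat
import subprocess
import tempfile

# Illustrative only -- never hard-code a real passphrase.
os.environ["GIT_PASSPHRASE"] = "correct horse battery staple"

# Write a tiny helper script that prints the passphrase when git/ssh asks.
helper = tempfile.NamedTemporaryFile("w", suffix=".sh", delete=False)
helper.write('#!/bin/sh\nprintf "%s" "$GIT_PASSPHRASE"\n')
helper.close()
os.chmod(helper.name, stat.S_IRWXU)

env = dict(os.environ,
           GIT_ASKPASS=helper.name,   # git consults this for credentials
           SSH_ASKPASS=helper.name)   # ssh consults this when it has no tty

# subprocess.run(["git", "push"], env=env, check=True)
```

A key without a passphrase, or ssh-agent, remains the cleaner option; this just shows the mechanism.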

How do I execute ssh via Robot Framework's Process library

I'm at a loss for what to think, here. There must be something I'm missing since I'm able to execute this perfectly fine in another, similar context, but not here. I am trying to run the following console command via Robot Framework for the sake of testing my system's reaction to a network outage: ssh it#128.0.0.1 sudo ifconfig ens192 down (the user & host ip are modified for posting here.)
To accomplish this, I'm using the following line of code:
Run Process 'ssh it#128.0.0.1 sudo ifconfig ens192 down' shell=True stdout=stdout stderr=stderr
Now, I've executed this myself in console and saw that the result was an inability to ping the ip, which is what I want. So I know for certain that the command itself isn't a problem. What is a problem is the fact that I get the following result: /bin/sh: ssh it#128.0.0.1 sudo ifconfig ens192 down: command not found
That's fine, though, because I believe this is a simple issue of the ssh instance not having the it user's $PATH. For reasons described below, I'm going to guess its $PATH variable is empty. Thus, I modified the keyword call as such:
Run Process 'ssh it#128.0.0.1 PATH\=/usr/sbin sudo ifconfig ens192 down' shell=True stdout=stdout stderr=stderr
The result of this command is as follows: /bin/sh: ssh it#128.0.0.1 PATH=/usr/sbin sudo ifconfig ens192 down: No such file or directory, which I believe to be referring to the items in the PATH attribute. I've tried several different things to determine what the problem could be, such as why it would be unable to find these files in an instance where, executed manually in the console, these directories are indeed present.
Given that I have less than a year's worth of experience with Robot Framework, I'm certain that there's just something I'm not understanding about the Process library. Could I have an explanation as to what I am missing for this execution and, if possible, an explanation as to how I should be executing an ssh command via Robot Framework?
A few extra things worth pointing out:
The it user is brand new, having been created via useradd it and copying the home directory contents into its new space.
I have already added an entry into the device's sudoers file to allow the it user to execute ifconfig without requiring password entry.
On that note, I've sent the executing user's ssh keys over to the device such that it does not require a password to log into it as that user.
I have attempted to execute several basic commands. I have intermittently gotten ls to work but have been unable to get a greater sense of what $PATH the Robot Framework ssh instance has because echo doesn't seem to work.
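For what it's worth, the error message above suggests a quoting problem rather than a $PATH problem: when the whole quoted command reaches the shell as a single token, /bin/sh looks for a program whose name is the entire string "ssh it#128.0.0.1 sudo ifconfig ens192 down". The equivalent behaviour in plain Python, and the fix of splitting the command into separate argv words (the host string here is a hypothetical stand-in), looks like this:

```python
import shlex

cmd = "ssh it@128.0.0.1 sudo ifconfig ens192 down"

# As one quoted token, the shell would treat the whole string as the
# program name. Split it into separate argv entries instead:
args = shlex.split(cmd)
# ['ssh', 'it@128.0.0.1', 'sudo', 'ifconfig', 'ens192', 'down']

# subprocess.run(args, check=True)  # each word becomes its own argument
```

In Robot Framework terms, the analogous fix is dropping the surrounding quotes and separating the command from its arguments with two or more spaces so Run Process receives them as distinct items.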

Automating jobs on remote servers with python2.7

I need to make a python script that will do these steps in order, but I'm not sure how to go about setting this up.
1. SSH into a server
2. Copy a folder from point A to point B (cp /foo/bar/folder1 /foo/folder2)
3. mysql -u root -pfoobar (this database is accessible from localhost only)
4. Create a database, do some other mysql stuff in the mysql console
5. Replace instances of Foo with Bar in file foobar
6. Copy and edit a file
7. Restart a service
The fact that I have to ssh into a server, and THEN do all of this, is really confusing me. I looked into the Fabric library, but it seems to only run one command at a time and doesn't keep context from previous commands.
I looked into the Fabric library, but it seems to only run one command at a time and doesn't keep context from previous commands.
Look into Fabric more. It is still probably what you want.
This page has a lot of good examples.
By "context" I'm assuming you want to be able to cd into another directory and run commands from there. That's what fabric.context_managers.cd is for -- search for it on that page.
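A rough stdlib sketch of the idea behind fabric.context_managers.cd (not Fabric's actual implementation): each run() inside the with-block gets a "cd <path> && " prefix, which is how "context" survives between commands that each execute in their own remote shell.

```python
from contextlib import contextmanager

_cwd_stack = []

@contextmanager
def cd(path):
    # Remember the directory for the duration of the with-block.
    _cwd_stack.append(path)
    try:
        yield
    finally:
        _cwd_stack.pop()

def run(command):
    # Real Fabric sends this over SSH; here we just show the composed command.
    prefix = " && ".join("cd %s" % p for p in _cwd_stack)
    return prefix + " && " + command if prefix else command

with cd("/foo/bar"):
    composed = run("cp -r folder1 /foo/folder2")
# composed == "cd /foo/bar && cp -r folder1 /foo/folder2"
```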
Sounds like you are doing some sort of remote deployment/configuring. There's a whole world of tools out there to set this up professionally; look into Chef and Puppet.
Alternatively if you're just looking for a quick and easy way of scripting some remote commands, maybe pexpect can do what you need.
Pexpect is a pure Python module for spawning child applications; controlling them; and responding to expected patterns in their output.
I haven't used it myself but a quick glance at its manual suggests it can work with an SSH session fine: https://pexpect.readthedocs.org/en/latest/api/pxssh.html
I have never used Fabric.
My way to solve these kinds of issues (before I started using SaltStack) was to use pexpect to run the ssh connection and all the commands that were needed.
Maybe using a series of SQL scripts to work with the database (just to make it easier) would help.
Another way, since you need to access the remote server using ssh, would be to use paramiko to connect and execute commands remotely. It's a bit more complicated when you want to see what's happening on stdout (while with pexpect you see exactly what's going on).
But it all depends on what you really need.

Python Fabric: executing interactive program such as less on remote?

When I execute something like:
run('less <somefile>')
Within fabric, it prepends the lines with Out: and interacting with it doesn't work as expected.
If I run it with:
run('cat <something>', pty=False)
The output isn't prepended with anything and I can actually pipe that into less locally, like:
fab less | less
However, I'm not sure if that's recommended, since cat will continually be piping data back through ssh, which may tax the remote resource. Also, when I quit less before the whole file has been cat'd (it could be over 1GB), I get a broken pipe error.
What would be the recommended way to facilitate this? Should I just use ssh directly like:
ssh <remote host> less <something>
If you're doing interactive work on the remote host, then maybe just using SSH is fine. I think fabric is mostly useful when automating actions.
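If you do go through ssh directly, forcing pseudo-tty allocation (-t) keeps interactive programs like less behaving normally. A small sketch, with a hypothetical host and path:

```python
import subprocess

def remote_less(host, path):
    # -t forces a pseudo-tty on the remote side so less is fully interactive;
    # stdin/stdout are inherited from the current terminal when this is run.
    cmd = ["ssh", "-t", host, "less", path]
    return cmd  # e.g. pass to subprocess.call(cmd)

cmd = remote_less("remote-host", "/var/log/syslog")
```

Unlike the cat-into-local-less trick, less runs remotely and only requests the pages you view, so nothing close to the full 1GB crosses the wire.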

How does python fabric protect ssh credentials?

So I've recently stumbled upon the python fabric api and have been really happy with how it can help me with day-to-day sysadmin tasks. I would like to start using it at work, but it is a very security-conscious environment. I was wondering how fabric handles the ssh password you provide to it while it runs its tasks? I'm assuming it plonks it in memory somewhere and pulls it out when required to log in to the next host in env.hosts? How does it protect this password while in memory?
I can see I'm going to be asked lots of questions along these lines so I'm looking for a nice way to explain to security-minded type of people that fabric is nice and friendly and doesn't pose a risk or at least no more of a risk than anything else we already have :)
I looked briefly through the source #dm03514 referenced, and I believe you are correct: if and when fabric needs to prompt interactively for a password, it will read it into memory and store it for the duration of the fabric python process. The way to address your concern is not with fabric itself but with ensuring your ssh infrastructure is using keys instead of passwords, and ssh agent forwarding where appropriate. Use encrypted ssh keys and ssh-agent to unlock them, and fabric will be able to utilize that same mechanism and thus avoid ssh passwords getting involved at all. For sudo passwords, you'll either have to allow passwordless sudo or accept the risk of fabric having the sudo password in memory while it is working.
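One way to make the keys-plus-agent argument concrete for a security reviewer is to show that the Python process only ever sees the agent's socket path, never any key material. A sketch (SSH_AUTH_SOCK is the environment variable ssh-agent exports):

```python
import os

def agent_available():
    # ssh (and therefore Fabric) authenticates via this UNIX socket when it
    # is set; the decrypted key stays inside ssh-agent's own process, so no
    # password or key material ever enters the Python process's memory.
    return bool(os.environ.get("SSH_AUTH_SOCK"))

if not agent_available():
    # Warn (or abort) rather than letting Fabric fall back to a password prompt.
    print("No ssh-agent detected; run ssh-agent and ssh-add first.")
```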
